Meera Pankhania: From funding to delivery - Ensuring alignment from start to finish
It’s a chicken-and-egg situation when it comes to securing funding for a large transformation program in government. On one hand, you need to submit a business case and, as part of that, you need to make early decisions about how you might approach and deliver the program of work. On the other hand, you need to know enough about the problem you are going to solve to ensure you have sufficient funding to understand the problem better, hire the right people, design the right service, and build it the right way.
Now imagine securing hundreds of millions of dollars to design and build a service, but not feeling confident about what the user needs are. What if you had the opportunity to change this common predicament and influence your leadership team to carry out alignment activities, all while successfully delivering within the committed time frames?
Meera Pankhania, Design Director and Co-founder of Propel Design, recently spoke at UX New Zealand, the leading UX and IA conference in New Zealand hosted by Optimal Workshop, on traceability and her learnings from delivering a $300 million Government program.
In her talk, Meera helps us understand how to use service traceability techniques in our work and apply them to any environment - ensuring we design and build the best service possible, no matter the funding model.
Background on Meera Pankhania
As a design leader, Meera is all about working on complex, purpose-driven challenges. She helps organizations take a human-centric approach to service transformation and helps deliver impactful, pragmatic outcomes while building capability and leading teams through growth and change.
Meera co-founded Propel Design, a strategic research, design, and delivery consultancy in late 2020. She has 15 years of experience in service design, inclusive design, and product management across the private, non-profit, and public sectors in both the UK and Australia.
Meera is particularly interested in policy and social design. After a stint in the Australian Public Service, Meera was appointed as a senior policy adviser to the NSW Minister for Customer Service, Hon. Victor Dominello MP. In this role, she played a part in NSW’s response to the COVID pandemic, flexing her design leadership skills in a new, challenging, and important context.
From funding to delivery: ensuring alignment from start to finish 🏁🎉👏
Meera’s talk explores a fascinating case study within the Department of Employment Services (Australia), where a substantial funding investment of around $300 million set the stage for a transformative journey. This funding supported the delivery of a revamped Employment Services Model, with the goal of delivering better services to job seekers and employers, and a better experience for the providers operating within the system. The project had a focus on aligning teams prior to delivery, which resulted in a huge amount of groundwork for Meera.
Her journey involved engaging various stakeholders within the department, including executives, to understand the program as a whole and what exactly needed to be delivered. “Traceability” became the watchword for this project, and her approach played out in three phases:
Phase 1: Aligning key deliverables
Phase 2: Ensuring delivery readiness
Phase 3: Building sustainable work practices
Phase 1: Aligning key deliverables 🧮
Research and discovery (pre-delivery)
Meera’s work initially meant conducting extensive research and engagement with executives, product managers, researchers, designers, and policymakers. Through this process, a common theme was identified – the urgent (and perhaps misguided) need to start delivering! Often, organizations focus on obtaining funding without adequately understanding the complexities involved in delivering the right services to the right users, leading to half-baked delivery.
After this initial research, some general themes started to emerge:
Assumptions were made that still needed validation
Teams weren’t entirely sure that they understood the user’s needs
Teams lacked a holistic understanding of how much research and design was needed
The conclusion of this phase was that “what” needed to be delivered wasn’t clearly defined. The same was true for “how” it would be delivered.
Traceability
Meera’s journey heavily revolved around the concept of “traceability” and sought to ensure that every step taken within the department was aligned with the ultimate goal of improving employment services. Traceability meant having a clear origin and development path for every decision and action taken. This is particularly important when spending taxpayer dollars!
So, over a planned eight weeks (which turned out to take much longer), the team combed through documents in an effort to bring everything together and make sense of the program as a whole. This involved planning, user journey mapping, and testing and refinement.
Documenting Key Artifacts
Numerous artifacts and documents played a crucial role in shaping decisions. Meera and her team gathered and organized these artifacts, including policy requirements, legislation, business cases, product and program roadmaps, service maps, and blueprints. The team also included prior research insights and vision documents which helped to shape a holistic view of the required output.
After combing through the program documents and laying everything out, it became clear that there were a lot of gaps and a LOT to do.
Prioritising tasks
As a result of these gaps, a process of task prioritization was necessary. Tasks were categorized based on a series of factors and then mapped against things like user touch points, pain points, features, business policy, and technical capabilities.
This then enabled Meera and the team to create Product Summary Tiles, so that each product team had its own summary ahead of a series of planning sessions. It gave them as much context (drawn from the traceability exercise) as possible to help with planning. Essentially, these tiles provided teams with a comprehensive overview of their projects, i.e. what their users need, what certain policies require them to deliver, and so on.
Phase 2: Ensuring delivery readiness 🙌🏻
Meera wanted every team to feel confident that they weren’t doing too much or too little in order to design and build the right service, the right way.
Standard design and research check-ins were well adopted, which was a great start, but Meera and the team also built a Delivery Readiness Tool. It was used to assess a team's readiness to move forward with a project. This tool includes questions related to the development phase, user research, alignment with the business case, consideration of policy requirements, and more. Ultimately, it ensures that teams have considered all necessary factors before progressing further.
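The talk doesn’t share how the Delivery Readiness Tool was implemented, but its logic is essentially a structured checklist. A minimal sketch of that idea follows; the questions and pass condition here are illustrative assumptions, not the actual tool’s content:

```python
# Illustrative sketch of a delivery readiness check. The questions and the
# "all must pass" rule are hypothetical, not from the actual tool.

READINESS_QUESTIONS = [
    "Have the team's assumptions been validated with user research?",
    "Are the user needs for this slice of the service documented?",
    "Is the work traceable to the business case and policy requirements?",
    "Does the team know which design artifacts it must produce?",
]

def delivery_readiness(answers: dict[str, bool]) -> tuple[bool, list[str]]:
    """Return (ready, unresolved questions) for a team's self-assessment."""
    unresolved = [q for q in READINESS_QUESTIONS if not answers.get(q, False)]
    return (len(unresolved) == 0, unresolved)

answers = {q: True for q in READINESS_QUESTIONS}
answers[READINESS_QUESTIONS[0]] = False  # one assumption still unvalidated
ready, gaps = delivery_readiness(answers)
# ready is False; gaps lists the single unresolved question
```

The value of a check like this is less the code than the conversation it forces: a team can’t progress until every unresolved question has an owner.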
Phase 3: Building sustainable work practices 🍃
As the program progressed, several sustainable work practices emerged which Government executives were keen to retain going forward.
Some of these included:
ResearchOps Practice: The team established a research operations practice, streamlining research efforts and ensuring that ongoing research was conducted efficiently and effectively.
Consistent Design Artifacts: Templates and consistent design artifacts were created, reducing friction and ensuring that teams going forward started from a common baseline.
Design Authority and Ways of Working: A design authority was established to elevate and share best practices across the program.
Centralized and Decentralized Team Models: The program showcased the effectiveness of a combination of centralized and decentralized team models. A central design team provided guidance and support, while service design leads within specific service lines ensured alignment and consistency.
Why it matters 🔥
Meera's journey serves as a valuable resource for those working on complex design programs, emphasizing the significance of aligning diverse stakeholders and maintaining traceability. Alignment and traceability are critical to ensuring that programs never lose sight of the problem they’re trying to solve, both from the user and organization’s perspective. They’re also critical to delivering on time and within budget!
Traceability key takeaways 🥡
Early Alignment Matters: While early alignment is ideal, it's never too late to embark on a traceability journey. It can uncover gaps, increase confidence in decision-making, and ensure that the right services are delivered.
Identify and audit: You never know what artifacts will shape your journey. Identify everything early, and don’t be afraid to get clarity on things you’re not sure about.
Conducting traceability is always worthwhile: Even if you don’t find many gaps in your program, you will at least gain a high level of confidence that your delivery is focused on the right things.
Delivery readiness key takeaways 🥡
Skills Mix is Vital: Assess and adapt team member roles to match their skills and experiences, ensuring they are positioned optimally.
Not Everyone Shares the Same Passion: Recognize that not everyone will share the same level of passion for design and research. Make the relevance of these practices clear to all team members.
Sustainability key takeaways 🥡
One Size Doesn't Fit All: Tailor methodologies, templates, and practices to the specific needs of your organization.
Collaboration is Key: Foster a sense of community and collective responsibility within teams, encouraging shared ownership of project outcomes.
At Optimal Workshop, we're dedicated to building the best user research platform to empower you with the tools to better understand your customers and create intuitive digital experiences. We're thrilled to announce some game-changing updates and new products that are on the horizon to help elevate the way you gather insights and keep customers at the heart of everything you do.
What’s new…
Integration with Figma 🚀
Last month, we joined forces with design powerhouse Figma to launch our integration. You can import images from Figma into Chalkmark (our click-testing tool) in just a few clicks, streamlining your workflows and giving you the insights to make decisions based on data, not hunches and opinions.
What’s coming next…
Session Replays 🧑‍💻
With session replay, you can focus on other tasks while Optimal Workshop automatically captures card sort sessions for you to watch in your own time. Gain valuable insights into how participants engage with and interpret a card sort without the hassle of running moderated sessions. The first iteration of session replays captures study interactions only; it will not include audio or face recording, but this is something we are exploring for future iterations. Session replays will be available in tree testing and click-testing later in 2024.
Reframer Transcripts 🔍
Say goodbye to juggling note-taking and hello to more efficient ways of working with Transcripts! We're continuing to add more capability to Reframer, our qualitative research tool, which now includes the importing of interview transcripts. Save time and reduce human errors and oversights by importing transcripts, then tagging and analyzing observations all within Reframer. We’re committed to building on transcripts with video and audio transcription capability in the future, and we’ll keep you in the loop on when to expect those releases.
Prototype testing 🧪
The team is fizzing to be working on a new Prototype testing product designed to expand your research methods and help you test prototypes easily from the Optimal Workshop platform. Testing prototypes early and often is an important step in the design process, saving you time and money before you invest too heavily in the build. We are working with customers on delivering the first iteration of this exciting new product. Stay tuned for Prototypes coming in the second quarter of 2024.
Workspaces 🎉
Making Optimal Workshop easier for large organizations to manage teams and collaborate more effectively on projects is a big focus for 2024. Workspaces are the first step towards empowering organizations to better manage multiple teams and their projects. Projects will allow greater flexibility over who can see what, encouraging working in the open and collaboration, alongside the ability to make projects private. The privacy feature is available on Enterprise plans.
Questions upgrade❓
Our survey product Questions is in for a glow-up in 2024 💅. The team is enjoying working with customers, collecting and reviewing feedback on how to improve Questions, and will be sharing more on this in the coming months.
Help us build a better Optimal Workshop
We are looking for new customers to join our research panel to help influence product development. From time to time, you’ll be invited to join us for interviews or surveys, and you’ll be rewarded for your time with a thank-you gift. If you’d like to join, email product@optimalworkshop.com.
Last week Optimal Workshop was delighted to sponsor UXDX USA 2024 in New York. The User Experience event brings together Product, Design, UX, CX, and Engineering professionals and our team had an amazing time meeting with customers, industry experts, and colleagues throughout the conference. This year, we also had the privilege of sharing some of our industry expertise by running an interactive forum on “Measuring the Value of UX Research” - a topic very close to our hearts.
Our forum, hosted by Optimal Workshop CEO Alex Burke and Product Lead Ella Fielding, was focused on exploring the value of User Experience Research (UXR) from both an industry-wide perspective and within the diverse ecosystem of individual companies and teams conducting this type of research today.
The session brought together a global mix of UX professionals for a rich discussion on measuring and demonstrating the effectiveness of UXR, and on the challenges facing organizations trying to tie UXR to tangible business value today.
The main topics for the discussion were:
Metrics that Matter: How do you measure UXR's impact on sales, customer satisfaction, and design influence?
Challenges & Strategies: What are the roadblocks to measuring UXR impact, and how can we overcome them?
Beyond ROI: UXR's value beyond just financial metrics
Some of the key takeaways from our discussions during the session were:
The current state of UX maturity and value
Many UX teams don’t measure the impact of UXR on core business metrics; among attendees, those not measuring the impact of their work outnumbered those who are.
Alex and Ella discussed with attendees the current state of UX research maturity, and the ability to prove its value, across the organizations represented in the room. Most were still early in their UX research maturity, with only 5% considering themselves advanced in having research culturally embedded.
Defining and proving the value of UX research
The industry doesn’t have clear alignment on, or understanding of, what good measurement looks like. Many teams don’t know how to accurately measure UXR impact, or don’t have the tools or platforms to measure it; both serve as core roadblocks to measuring UXR’s impact.
Alex and Ella discussed challenges in defining and proving the value of UX research, with common values being getting closer to customers, innovating faster, de-risking product decisions, and saving time and money. However, the value of research is hard to quantify compared to other product metrics like lines of code or features shipped.
Measuring and advocating for UX research
When teams are measuring UXR today there is a strong bias for customer feedback, but little ability or understanding about how to measure impact on business metrics like revenue.
The most commonly used metrics for measuring UXR are quantitative and qualitative feedback from customers, as opposed to internal metrics like stakeholder involvement or tying UXR to business performance metrics (including financial performance).
Attendees felt that in organizations where research is more embedded, researchers spend significant time advocating for research and proving its value to stakeholders rather than just conducting studies. This included tactics like research repositories and pointing to the impact of past studies, as well as ongoing battles to shape decision-making processes.
One of our attendees highlighted that engaging stakeholders in defining key research metrics before running research was key for them in proving value internally.
Relating user research to financial impact
Alex and Ella asked the audience for examples of demonstrating the financial impact of research to justify investment in the team, and we got some excellent examples proving that there are tangible ways to tie research outcomes to core business metrics, including:
Calculating time savings for employees from internal tools as a financial impact metric.
Measuring a reduction in calls to service desks as a way to quantify financial savings from research.
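To make the first tactic concrete, here is a back-of-the-envelope calculation of the kind attendees described. Every figure below is a hypothetical illustration, not a number from the session:

```python
# Hypothetical worked example: annual savings from an internal-tool
# improvement that research identified. All inputs are illustrative.

minutes_saved_per_task = 5        # time saved per task after the redesign
tasks_per_employee_per_day = 4
employees = 200
working_days = 230                # working days per year
hourly_cost = 60                  # fully loaded cost per employee-hour, $

hours_saved = (minutes_saved_per_task * tasks_per_employee_per_day
               * employees * working_days) / 60
annual_saving = hours_saved * hourly_cost
# roughly 15,333 hours saved, worth about $920,000 per year
```

Even with conservative inputs, framing research outcomes in employee-hours and dollars gives stakeholders a number they can weigh against the cost of the study.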
Most attendees recognize the value in embedding UXR more deeply in all levels of their organization - but feel like they’re not succeeding at this today.
Most attendees feel that UXR is not fully embedded in their organization or culture, but that if it were, they would be more successful in proving its overall value.
Stakeholder buy-in and engagement with UXR, particularly from senior leadership, varied enormously across organizations, and wasn’t regularly measured as an indicator of UXR value.
In organizations where research was more successfully embedded, researchers had to spend significant time and effort building relationships with internal stakeholders before and after running studies. This took time and effort away from actual research, but ended up making the research more valuable to the business in the long run.
With the large range of UX maturity and the democratization of research across teams, we know there’s a lot of opportunity for our customers to improve their ability to tie their user research to tangible business outcomes and embed UX more deeply in all levels of their organizations. To help fill this gap, Optimal Workshop is currently running a large research project on Measuring the Value of UX which will be released in a few weeks.
User research is invaluable, but in fast-paced environments, researchers often struggle with tight deadlines, limited resources, and the need to prove their impact. In our recent UX Insider webinar, Weidan Li, Senior UX Researcher at Seek, shared insights on Efficient Research—an approach that optimizes Speed, Quality, and Impact to maximize the return on investment (ROI) of understanding customers.
At the heart of this approach is the Efficient Research Framework, which balances these three critical factors:
Speed – Conducting research quickly without sacrificing key insights.
Quality – Ensuring rigor and reliability in findings.
Impact – Making sure research leads to meaningful business and product changes.
Within this framework, Weidan outlined nine tactics that help UX researchers work more effectively. Let’s dive in.
1. Time Allocation: Invest in What Matters Most
Not all research requires the same level of depth. Efficient researchers prioritize their time by categorizing projects based on urgency and impact:
High-stakes decisions (e.g., launching a new product) require deep research.
Routine optimizations (e.g., tweaking UI elements) can rely on quick testing methods.
Low-impact changes may not need research at all.
By allocating time wisely, researchers can avoid spending weeks on minor issues while ensuring critical decisions are well-informed.
2. Assistance of AI: Let Technology Handle the Heavy Lifting
AI is transforming UX research, enabling faster and more scalable insights. Weidan suggests using AI to:
Automate data analysis – AI can quickly analyze survey responses, transcripts, and usability test results.
Generate research summaries – Tools like ChatGPT can help synthesize findings into digestible insights.
Speed up recruitment – AI-powered platforms can help find and screen participants efficiently.
While AI can’t replace human judgment, it can free up researchers to focus on higher-value tasks like interpreting results and influencing strategy.
3. Collaboration: Make Research a Team Sport
Research has a greater impact when it’s embedded into the product development process. Weidan emphasizes:
Co-creating research plans with designers, PMs, and engineers to align on priorities.
Involving stakeholders in synthesis sessions so insights don’t sit in a report.
Encouraging non-researchers to run lightweight studies, such as A/B tests or quick usability checks.
When research is shared and collaborative, it leads to faster adoption of insights and stronger decision-making.
4. Prioritization: Focus on the Right Questions
With limited resources, researchers must choose their battles wisely. Weidan recommends using a prioritization framework to assess:
Business impact – Will this research influence a high-stakes decision?
User impact – Does it address a major pain point?
Feasibility – Can we conduct this research quickly and effectively?
By filtering out low-priority projects, researchers can avoid research for research’s sake and focus on what truly drives change.
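One lightweight way to apply a framework like this is a weighted score per proposed study. The weights and 1-5 scales below are assumptions for illustration; Weidan’s talk names the three factors but not a specific formula:

```python
# Illustrative prioritization scoring for a research backlog. The weights
# and 1-5 rating scales are hypothetical, not from the talk.

def priority_score(business_impact: int, user_impact: int, feasibility: int) -> float:
    """Score a proposed study on three 1-5 factors; higher means do it sooner."""
    weights = {"business": 0.4, "user": 0.4, "feasibility": 0.2}
    return (weights["business"] * business_impact
            + weights["user"] * user_impact
            + weights["feasibility"] * feasibility)

backlog = {
    "checkout redesign usability test": priority_score(5, 5, 3),
    "icon color preference poll": priority_score(1, 2, 5),
}
# The high-stakes study scores ~4.6; the low-impact one scores ~2.2 and can
# be dropped rather than done "for research's sake".
```

The exact weights matter less than agreeing on them with stakeholders up front, so that cutting a low-scoring study is a shared decision rather than a researcher’s unilateral one.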
5. Depth of Understanding: Go Beyond Surface-Level Insights
Speed is important, but efficient research isn’t about cutting corners. Weidan stresses that even quick studies should provide a deep understanding of users by:
Asking why, not just what – Observing behavior is useful, but uncovering motivations is key.
Using triangulation – Combining methods (e.g., usability tests + surveys) to validate findings.
Revisiting past research – Leveraging existing insights instead of starting from scratch.
Balancing speed with depth ensures research is not just fast, but meaningful.
6. Anticipation: Stay Ahead of Research Needs
Proactive researchers don’t wait for stakeholders to request studies—they anticipate needs and set up research ahead of time. This means:
Building a research roadmap that aligns with upcoming product decisions.
Running continuous discovery research so teams have a backlog of insights to pull from.
Creating self-serve research repositories where teams can find relevant past studies.
By anticipating research needs, UX teams can reduce last-minute requests and deliver insights exactly when they’re needed.
7. Justification of Methodology: Explain Why Your Approach Works
Stakeholders may question research methods, especially when they seem time-consuming or expensive. Weidan highlights the importance of educating teams on why specific methods are used:
Clearly explain why qualitative research is needed when stakeholders push for just numbers.
Show real-world examples of how past research has led to business success.
Provide a trade-off analysis (e.g., “This method is faster but provides less depth”) to help teams make informed choices.
A well-justified approach ensures research is respected and acted upon.
8. Individual Engagement: Tailor Research Communication to Your Audience
Not all stakeholders consume research the same way. Weidan recommends adapting insights to fit different audiences:
Executives – Focus on high-level impact and key takeaways.
Product teams – Provide actionable recommendations tied to specific features.
Designers & Engineers – Share usability findings with video clips or screenshots.
By delivering insights in the right format, researchers increase the likelihood of stakeholder buy-in and action.
9. Business Actions: Ensure Research Leads to Real Change
The ultimate goal of research is not just understanding users—but driving business decisions. To ensure research leads to action:
Follow up on implementation – Track whether teams apply the insights.
Tie findings to key metrics – Show how research affects conversion rates, retention, or engagement.
Advocate for iterative research – Encourage teams to re-test and refine based on new data.
Research is most valuable when it translates into real business outcomes.
Final Thoughts: Research That Moves the Needle
Efficient research is not just about doing more, faster—it’s about balancing speed, quality, and impact to maximize its influence. Weidan’s nine tactics help UX researchers work smarter by:
✔️ Prioritizing high-impact work
✔️ Leveraging AI and collaboration
✔️ Communicating research in a way that drives action
By adopting these strategies, UX teams can ensure their research is not just insightful, but transformational.