March 12, 2024
6 min

Dan Dixon and Stéphan Willemse: HCD is dead, long live HCD

There is a strong backlash against the perceived failures of Human Centred Design (HCD) and its contribution to contemporary macro problems. There seems to be a straightforward connection: HCD and Design Thinking have been adopted by organizations and are increasingly part of product and experience development, especially in big tech. However, the full picture is more complex, and HCD does have some issues.

Dan Dixon, UX and Design Research Director, and Stéphan Willemse, Strategy Director/Head of Strategy, both from the Digital Arts Network, recently spoke at UX New Zealand, the leading UX and IA conference in New Zealand hosted by Optimal Workshop, about the evolution and future of HCD.

In their talk, Dan and Stéphan cover the history of HCD, its use today, and its limitations, before presenting a Post HCD future. What could it be, and how should it be different? Dan and Stéphan help us to step outside of ourselves as we meet new problems with new ways of Design Thinking.

Dan Dixon and Stéphan Willemse bios

Dan is a long-term practitioner of human-centred experience design and has a wealth of experience in discovery and qualitative research. He’s worked in academic, agency and client-side roles in both the UK and NZ, covering diverse fields such as digital, product design, creative technology and game design. His history blends a background in the digital industry with creative technology teaching and user experience research. He has taken pragmatic real-world knowledge into a higher education setting, and brought deeper research skills from academia into commercial design projects. Through higher education, as well as talks and workshops, Dan has been teaching and sharing these skills for the last 16 years.

Stéphan uses creativity, design and strategy to help organizations innovate towards positive, progressive futures. He works across innovation, experience design, emerging technologies, cultural intelligence and futures projects with clients including Starbucks, ANZ, Countdown, TradeMe and the public sector. He holds degrees in PPE, Development Studies, Education and an Executive MBA. However, he doesn’t like wearing a suit and his idea of the perfect board meeting is at a quiet surf break. He thinks ideas are powerful and that his young twins ask the best questions about the world we live in.

Contact Details:

Email: dan.dixon@digitalartsnetwork.com

Find Dan on LinkedIn  

HCD IS DEAD, LONG LIVE HCD 👑

Dan and Stéphan take us through the evolving landscape of Human Centred Design (HCD) and Design Thinking. Can HCD effectively respond to the challenges of the modern era, and can we get ahead of the unintended consequences of our design? They examine the inputs and processes of design, not just the output, to scrutinize the very essence of design practice.

A brief history of HCD

In the 1950s and 1960s, designers began exploring the application of scientific processes to design, aiming to transform it into a systematic problem-solving approach. Later in the 1960s, design thinkers in Scandinavia initiated the shift towards cooperative and participative design practices. Collaboration and engagement with diverse stakeholders became integral to design processes. Then, the 1970s and 1980s marked a shift in perspective, viewing design as a fundamentally distinct way of approaching problems. 

Moving into the late 1980s and 1990s, design thinking expanded to include user-centered design, and the idea of humans and technology becoming intertwined. Then the 2000s witnessed a surge in design thinking, where human-centered design started to make its mark.

Limitations of the “design process”

Dan and Stéphan discuss the “design squiggle”, a concept that portrays the messy and iterative nature of design, starting chaotically and gradually converging toward a solution. For 20 years, beginning in the early 90s, this was a popular way to explain how the design process feels. However, in the past 10 years or so, efforts to teach and pass down design processes have become common practice. Here enter concepts like the “double diamond” and “pattern problem”, which seek to be repeatable and process-driven. These neat processes, however, demand rigid adherence to specific design methods, which can ultimately stifle innovation. 

Issues with HCD and its evolution

The critique of such rigid design processes, which developed alongside HCD, highlights the need to acknowledge that humans are just one element in an intricate network of actors. By putting ourselves at the center of our design processes and efforts, we already limit our design. Design is just as much about the ecosystem surrounding any given problem as it is about the user. A limitation of HCD is that we humans are not actually at the center of anything except our own minds. So, how can we address this limitation?

Post-anthropocentric design starts to acknowledge that we are far less rational than we believe ourselves to be. It captures the idea that there are no clear divisions between ‘being human’ and everything else. This concept has become important as we adopt more and more technology into our lives, and we’re getting more enmeshed in it. 

Post-human design extends this further by removing ourselves from the center of design and empathizing with “things”, not just humans. This concept embraces the complexity of our world and emphasizes how we need to think about the problem just as much as we think about the solution. In other words, post-human design encourages us to “live” in our design problem(s) and consider multiple interventions.

Finally, Dan and Stéphan discuss the concept of Planetary design, which stresses that everything we create, and everything we do, can impact everything else in the world. In fact, our designs do impact everything else, and we need to try to be aware of those possibilities.

Integrating new ways of thinking about design

To think beyond HCD and to foster innovation in design, we can begin by embracing emerging design practices and philosophies such as "life-centered design," "society-centered design," and "humanity-centered design." These emerging practices have toolsets that are readily available online and can be seamlessly integrated into your design approach, helping us to break away from traditional, often linear, methodologies. Or, taking a more proactive stance, we can craft our own unique design tools and frameworks.

Why it matters 🎯

To illustrate how design processes can evolve to meet the challenges of our time, Dan and Stéphan present their concept of “Post human-centered design” (Post HCD). At its heart, it seeks to take what's great about HCD and build on it, all while understanding its issues and limitations.

Dan and Stéphan put forward, as a starting point, some challenges for designers to consider as we move our practice to its next phase.

Suggested Post HCD principles:

  • Human to context: Moving from a human-centered to a context-centered, or context-sensitive, point of view.
  • Design Process to Design Behaviour: Not being beholden to design processes like the “double diamond”. Instead of designing for problems, we should design for behaviors.
  • Problem-solutions to Interventions: Thinking more broadly about interventions in the problem space, rather than solutions to the problems.
  • Linear to Dynamic: Understand ‘networks’ and complex systems.
  • Repeated to Reflexive: Challenging status quo processes and evolving with challenges that we’re trying to solve.

The talk wraps up by encouraging designers to incorporate some of this thinking into everyday practice. Some key takeaways are: 

  • Expand your web of context: Don’t just think about things having a center, think about networks.
  • Have empathy for “things”: Consider how you might then have empathy for all of those different things within that network, not just the human elements of the network.
  • Design practice is exploration and design exploration is our practice: Ensure that we're exploring both our practice as well as the design problem.
  • Make it different every time: Every time we design, try to make it different, don't just try and repeat the same loop over and over again.


Related articles


Avoiding bias in the oh-so-human world of user testing

"Dear Optimal WorkshopMy question is about biasing users with the wording of questions. It seems that my co-workers and I spend too much time debating the wording of task items in usability tests or questions on surveys. Do you have any 'best practices' for wordings that evoke unbiased feedback from users?" — Dominic

Dear Dominic, Oh I feel your pain! I once sat through a two-hour meeting that was dominated by a discussion on the merits of question marks! It's funny how wanting to do right by users and clients can tangle us up like fine chains in an old jewellery box. In my mind, we risk provoking bias when any aspect of our research (from question wording to test environment) influences participants away from an authentic response. So there are important things to consider outside of the wording of questions as well. I'll share my favorite tips, and then follow them up with a must-read resource or two.

Balance your open and closed questions

The right balance of open and closed questions is essential to obtaining unbiased feedback from your users. Ask closed questions only when you want a very specific answer like 'How old are you?' or 'Are you employed?' and ask open questions when you want to gain an understanding of what they think or feel. For example, don’t ask the participant 'Would you be pleased with that?' (closed question). Instead, ask 'How do you feel about that?' or, even better, 'How do you think that might work?' The same advice goes for surveys, and be sure to give participants enough space to respond properly — fifty characters isn’t going to cut it.

Avoid using words that are linked to an emotion

The above questions lead me to my next point — don’t use words like ‘happy’. Don’t ask if they like or dislike something. Planting emotion-based words in a survey or usability test is an invitation for participants to tell you what they think you want to hear. No one wants to be seen as being disagreeable. If you word a question like this, chances are they will end up agreeing with the question itself, not the content or meaning behind it... does that make sense? Emotion-based questions only serve to distract from the purpose of the testing — leave them at home.

Keep it simple and avoid jargon

No one wants to look stupid by not understanding the terms used in the question. If it’s too complicated, your user might just agree or tell you what they think you want to hear to avoid embarrassment. Another issue with jargon is that some terms may have multiple meanings which can trigger a biased reaction depending on the user’s understanding of the term. A friend of mine once participated in user testing where they were asked if what they were seeing made them feel ‘aroused’. From a psychology perspective, that means you’re awake and reacting to stimuli.

From the user's perspective? I’ll let you fill in the blanks on that one. Avoid using long, wordy sentences when asking questions or setting tasks in surveys and usability testing. I’ve seen plenty of instances of overly complicated questions that make the user tune out (trust me, you would too!). And because people don't tend to admit their attention has wandered during a task, you risk getting a response that lacks authenticity — maybe even one that aims to please (just a thought...).

Encourage participants to share their experiences (instead of tying them up in hypotheticals)

Instead of asking your user what they think they would do in a given scenario, ask them to share an example of a time when they actually did do it. Try asking questions along the lines of 'Can you tell me about a time when you...?' or 'How many times in the last 12 months have you...?' Asking them to recall an experience they had allows you to gain factual insights from your survey or usability test, not hypothetical maybes that are prone to bias.

Focus the conversation by asking questions in a logical order

If you ask usability testing or survey questions in an order that doesn’t quite follow a logical flow, the user may think that the order holds some sort of significance, which in turn may change the way they respond. It’s a good idea to ensure that the questions tell a story and follow a logical progression, for example the steps in a process — don’t ask me if I’d be interested in registering for a service if you haven’t introduced the concept yet (you’d be surprised how often this happens!). For further reading on this, be sure to check out this great article from usertesting.com.

More than words — the usability testing experience as a whole

Reducing bias by asking questions the right way is really just one part of the picture. You can also reduce bias by influencing the wider aspects of the user testing process, and ensuring the participant is comfortable and relaxed.

Don’t let the designer facilitate the testing

This isn’t always possible, but it’s a good idea to try to get someone else to facilitate the usability testing on your design (and choose to observe if you like). This will prevent you from bringing your own bias into the room, and participants will be more comfortable being honest when the designer isn't asking the questions. I've seen participants visibly relax when I've told them I'm not the designer of a particular website, when it's apparent they've arrived expecting that to be the case.

Minimize discomfort and give observers a role

The more comfortable your participants are, with both the tester and the observer, the more they can be themselves. There are labs out there with two-way mirrors to hide observers, but in all honesty the police interrogation room isn’t always the greatest look! I prefer to have the observer in the testing room, while being conscious that participants may instinctively be uncomfortable with being observed. I’ve seen observer guidelines that insist observers (in the room) stay completely silent the entire time, but I think that can be pretty creepy for participants! Here's what works best (in my humble opinion).

The facilitator leads the testing session, of course, but the observer is able to pipe up occasionally, mostly for clarification purposes, and certainly join in the welcoming, 'How's the weather?' chit chat before the session begins. In fact, when I observe usability testing, I like to be the one who collects the participant from the foyer. I’m the first person they see and it’s my job to make them feel welcome and comfortable, so when they find out I'll be observing, they know me already. Anything you can do to make the participant feel at home will increase the authenticity of their responses.

A note to finish

At the end of the day the reality is we’re all susceptible to bias. Despite your best efforts you’re never going to eradicate it completely, but just being aware of and understanding it goes a long way to reducing its impacts. Usability testing is, after all, something we design. I’ll leave you with this quote from Jeff Sauro's must-read article on 9 biases to watch out for in usability testing:

"We do the best we can to simulate a scenario that is as close to what users would actually do .... However, no amount of realism in the tasks, data, software or environment can change the fact that the whole thing is contrived. This doesn't mean it's not worth doing."


Different ways to test information architecture

We all know that building a robust information architecture (IA) can make or break your product. Getting it right relies on robust user research, especially when it comes to creating human-centered, intuitive products that deliver outstanding user experiences.

But what are the best methods to test your information architecture, and to make sure you’re building one that is truly based on what your users want and need?

What is user research? 🗣️🧑🏻💻

With all the will in the world, your product (or website or mobile app) may work perfectly and be as intuitive as possible. But if it is built only on information from your internal organizational perspective, it may not measure up in the eyes of your users. Often, organizations make major design decisions without fully considering their users. User research backs up decisions with data, helping to make sure that design decisions are strategic decisions.

Testing your information architecture can also help establish the structure for a better product from the ground up, and ultimately improve the performance of your product. User experience research focuses your design on understanding your users’ expectations, behaviors, needs, and motivations. It is an essential part of creating, building, and maintaining great products.

Taking the time to understand your users through research can be incredibly rewarding with the insights and data-backed information that can alter your product for the better. But what are the key user research methods for your information architecture? Let’s take a look.

Research methods for information architecture ⚒️

There is more than one way to test your IA. Testing with one method is good, but testing with more than one is even better. And, of course, the more often you test, especially when there are major additions or changes, the more you can tweak and update your IA to improve and delight your users’ experience.

Card Sorting 🃏

Card sorting is a user research method that allows you to discover how users understand and categorize information. It’s particularly useful when you are starting to plan your information architecture, or at any stage where you notice issues or are making changes. It puts the power into your users’ hands by asking how they would intuitively sort the information. In a card sort, participants sort cards containing different items into labeled groups. You can use the results of a card sort to figure out how to group and label the information in a way that makes the most sense to your audience.

There are a number of techniques and methods that can be applied to a card sort. Take a look here if you’d like to know more.

Card sorting has many applications. It’s as useful for figuring out how content should be grouped on a website or in an app as it is for figuring out how to arrange the items in a retail store. You can also run a card sort in person, using physical cards, or remotely with online tools such as OptimalSort.
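To make the idea concrete, here is a minimal sketch of how open card sort results could be aggregated into pairwise "grouped together" percentages — the kind of similarity view card sorting analysis usually starts from. The data shape, group labels, and card names are hypothetical, purely for illustration; this is not how OptimalSort stores or processes results.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical open card sort results: one dict per participant, mapping
# each group label they created to the cards they placed in that group.
results = [
    {"Drinks": ["Espresso", "Flat white"], "Food": ["Muffin", "Bagel"]},
    {"Coffee": ["Espresso", "Flat white", "Muffin"], "Bakery": ["Bagel"]},
    {"Menu": ["Espresso", "Flat white"], "Snacks": ["Muffin", "Bagel"]},
]

# Count how often each pair of cards ends up in the same group.
pair_counts = defaultdict(int)
for participant in results:
    for cards in participant.values():
        for a, b in combinations(sorted(cards), 2):
            pair_counts[(a, b)] += 1

# Report co-occurrence as a percentage of participants — high percentages
# suggest cards your audience expects to live together.
for (a, b), count in sorted(pair_counts.items(), key=lambda kv: -kv[1]):
    print(f"{a} + {b}: grouped together by {count / len(results):.0%} of participants")
```

Pairs that most participants group together are strong candidates for sitting under the same label in your IA; pairs that are rarely grouped together probably shouldn't be forced into one category.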

Tree Testing 🌲

Taking a look at your information architecture from the other side can also be valuable. Tree testing is a usability method for evaluating the findability of topics on a product. Testing is done on a simplified text version of your site structure without the influence of navigation aids and visual design.

Tree testing tells you how easily people can find information on your product and exactly where people get lost. Your users rely on your information architecture – how you label and organize your content – to get things done.

Tree testing can answer questions like:

  • Do my labels make sense to people?
  • Is my content grouped logically to people?
  • Can people find the information they want easily and quickly? If not, what’s stopping them?

Treejack is our tree testing tool, designed to make it easy to test your information architecture. Running a tree test isn’t actually that difficult, especially if you’re using the right tool. You’ll learn how to set useful objectives, build your tree, write your tasks, recruit participants, and measure results.
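As a rough illustration of what tree test results can tell you, here is a small sketch that computes per-task success and directness from participant records. The field names, tasks, and data shape are invented for the example and don't reflect Treejack's actual data export.

```python
# Hypothetical tree test results: one record per participant per task, noting
# whether they reached a correct destination and whether they backtracked
# (i.e. visited parts of the tree outside the direct path).
results = [
    {"task": "Find opening hours", "success": True, "backtracked": False},
    {"task": "Find opening hours", "success": True, "backtracked": True},
    {"task": "Find opening hours", "success": False, "backtracked": True},
    {"task": "Update payment details", "success": True, "backtracked": False},
    {"task": "Update payment details", "success": False, "backtracked": True},
]

for task in sorted({r["task"] for r in results}):
    attempts = [r for r in results if r["task"] == task]
    success_rate = sum(r["success"] for r in attempts) / len(attempts)
    # "Direct" successes: participants who found the answer without backtracking,
    # a rough proxy for how obvious the path through your labels is.
    direct = sum(r["success"] and not r["backtracked"] for r in attempts)
    print(f"{task}: {success_rate:.0%} success, {direct}/{len(attempts)} direct successes")
```

Low success on a task points to labels or groupings that don't match how people think, while low directness (lots of backtracking before eventual success) suggests the right destination exists but the path to it isn't obvious.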

Combining information architecture research methods 🏗

If you want a fully rounded view of your information architecture, it can be useful to combine your research methods.

Tree testing and card sorting, along with usability testing, can give you insights into your users and audience. How do they think? How do they find their way through your product? And how do they want to see things labeled, organized, and sorted? 

If you want to get fully into the comparison of tree testing and card sorting, take a look at our article here, which compares the options and explains which is best and when. 

