Design thinking is many things, and well-described is one of them. But what does design thinking actually do? We don’t know.
The summer of 2017 was a big one for design thinking.
Natasha Jen from Pentagram gave a talk at the 2017 Adobe 99u conference and came out and said in public something that many of my fellow designers have said in private to each other: design thinking is bullshit.
Her talk ignited considerable discussion and debate over the merits and problems of design thinking, including from me. Some have taken the position that Jen is right, but that we still need design thinking anyway. It seemed that design thinking, an idea that had been gaining popularity in business and social sector circles, was about to reach an inflection point.
It’s been two years since that talk and the giant spark of debate it caused. What’s happened since?
A whole lot of nothing.
Design thinking continues to grow in popularity with a sharp rise in the visibility of courses offered by leading universities and executive education programs that seek to ‘improve your innovation and design thinking’. Designers and consultancies — many of whom are represented on Medium — continue to promote design thinking as a means to achieving impact and innovation.
Is any of this true? Is this all just bullshit or something else? The answer is: we don’t know. We still don’t know.
After two years of debate, perhaps it’s time to ask a different question: What does design thinking actually do?
Going beyond simple description
Search for ‘what is design thinking’ and you’ll find thousands of articles that describe the concept, although few that are widely agreed upon as canonical. These descriptions range in detail but largely include terms like finding the right problem, building empathy for the user, generating lots of ideas, prototyping the best ideas, deploying them into the world, and then revising them.
What is almost always missing is the concept of evaluation — one of my biggest complaints. Without systematic evaluation of what you create, how do you know a design is any good? Why would you put all that energy into creating something and then so little (if any) into seeing whether what you created was doing what you thought it was, and to what degree?
How do you know that the particular design thinking process you put in place accomplishes what you set out to do? (If the answer is: to create something new or think differently, that is a very low bar).
Description can be useful, but what we see in accounts of design thinking is painfully thin. Read an account of design thinking in almost any source (including the few academic ones (PDF)) and you’ll find lists of activities, but little detail on how each step was undertaken, what was generated from each step, the challenges and deviations made, and the overall outcome. In real terms, there is almost nothing that you can take away beyond generalities.
From thinking to skills
Another problem is that ‘design thinking’ is too often separated from design skill. An enormous debate has ensued over the idea that ‘everyone is a designer’ and that design thinking can instill or cultivate this in non-professional designers. Rather than dive too deep into that, we need to return to the question: What does design thinking actually do?
You can be a designer and still be a poorly skilled one. You might be well-versed in the language of design and in thinking approaches that foster creative problem solving, but if you lack craft, what kind of things will you create?
Bringing alternative perspectives to bear through design thinking is useful and can be enormously helpful at inspiring new thinking, new ways of seeing, and participation (and interest) in a topic. Engaging in something like co-design can engender trust in the public and greater participation in civic institutions. There are enormous potential benefits of doing design thinking, but the resulting products from design thinking may not be any better than using alternative approaches.
The problem is: we don’t know. (No one does systematic or comparative evaluation).
Another problem is that we might have hired design thinking to generate better solutions (the biggest question for innovators), not to engage people. It’s possible to do full co-design, participatory design, and engaging design thinking and achieve better, more useful, more impactful products than before, but it isn’t a guarantee. Better thinking doesn’t mean better skills.
From skills to culture
Jon Kolko, one of the clearest thinkers in design (in my opinion), wrote a piece in Harvard Business Review about design thinking coming of age. In it, he advocates the need to create design cultures within organizations as the next stage of development for what he loosely refers to as ‘design thinking’. His argument is that, without a culture of design to support design thinking in an organization, most of what we generate through the process won’t have an impact. It simply won’t stick.
I agree fully.
Having engaged clients for more than 15 years in hands-on collaborative design thinking, I can attest to the problem of ‘failure to launch’. Without a culture that embraces what design thinking aims to foster — empathy for others, creative problem framing and idea generation, prototyping, and implementation — design thinking is usually reduced to a creative planning exercise and nothing more.
Without skilled designers as part of the team, support from organization leadership, and attention paid to designing for the system in which the problem exists, we are left with an idea that has nothing to attach to.
Answering the question: What does design thinking do?
To answer the question of what design thinking actually does requires some form of evaluation. Evaluation is simply the systematic means of understanding an activity through evidence gained by observation and measurement. Qualitative, quantitative, and administrative data can all contribute to our understanding, but what’s important is that it is systematic.
A group of evaluation practitioners and scholars in Canada define evaluation as:
Evaluation is the systematic assessment of the design, implementation or results of an initiative for the purposes of learning or decision-making.
Evaluation can be flexible, dynamic, and responsive while still being systematic. Developmental evaluation is one approach that is suited for innovation. Evaluation can also be design-driven in its own right, by integrating data collection and sensemaking.
To illustrate: if you’re a chef, you probably appreciate good, systematic evaluation. If you rely solely on those who complain or say “I loved that dish” for your feedback, you’re losing out on opportunities to learn. How many people ordered the dish? What other offerings were there? What experience have diners had with the dish? Are these responses extremes or the norm?
What you ask, who you ask, when you ask it, how you ask it, and how you interpret and use the findings are all part of what evaluation asks and focuses on.
The many positive reviews of design thinking from consultancy clients can be viewed the same way through evaluation. If you’ve never been a part of developing a solution to a problem in your organization before, you’re more likely to view it as a positive. If you’ve made something, chances are you will defend and advocate for it when compared to something you’ve had no say in.
What a systematic approach does is account for the perspectives that people bring and make sense of what they mean for the innovation development process. A systematic approach to design thinking looks not only at what is done, but asks how it is done (the same way each time? differently? what modifications were made, and why?) and what came from those decisions (including unintended consequences).
Until we ask what design thinking actually does, we will continue to see more time, space, energy, and focus invested in something that has little evidence to back it up, at the potential cost of something else.
It’s time to ask better questions. Only then can we really tell whether we’re dealing with bullshit or not.