How Design Thinking, Lean, and Agile Work Together

Article originally published on September 20, 2017.

The ideas of Agile are great. It’s the way they’ve been codified into rituals and certifications, and rolled out mindlessly, that misses the point.

When people talk about Lean, the conversation often ends at process optimization, waste, and quality, and misses so much of what the Lean mindset offers.

Design Thinking is held high as the new magic trick of design facilitators.

That’s three mindsets corrupted by the unthinking masses, who’ve grabbed onto a tantalising promise of something better and followed the steps without really thinking it through. People have a real need to change, but they get stuck following rules or process without really understanding why.

Figure 1. Three mindsets of product development

Design Thinking is how we explore and solve problems; Lean is our framework for testing our beliefs and learning our way to the right outcomes; and Agile is how we adapt to changing conditions with software.

Design Thinking is about ability and learning. Carissa Carter, head of teaching at Stanford Design School, brilliantly describes some of the abilities that make designers great. Abilities like dealing with ambiguity, empathetic learning, synthesis, and experimentation, among others. A designer’s abilities to make meaning, frame a problem, and explore potential solutions are key. Donald Norman, author of The Design of Everyday Things, describes a designer’s discontent with the first idea. Ask yourself, when was the last time that your first idea was your best idea? Meaning and new ideas emerge when we explore things. Design Thinking is simply how we explore those problems and solutions. Everyone designs, whether it’s conscious or not. If you’re solving a problem, you’re designing a solution. Design Thinking is a mindset that helps us do it better.

Lean started out as a response to scientific management practices in manufacturing. Organisations sought efficiency through process, rules, and procedures, and management was mostly about control. But in modern business, control is a falsehood. Things are too complex, too unpredictable, and too dynamic to be controlled. Lean offers a different mindset for managing any system of work. It’s fundamentally about exploring uncertainty, making decisions by experimenting and learning, and empowering people who are closest to the work to decide how best to achieve desired outcomes. Lean says be adaptive, not predictive.

Agile is related to Lean. The differences are mostly about what these mindsets are applied to, and how. In conditions of high uncertainty, Agile offers ways to build software that is dynamic and can adapt to change. This isn’t just about pivoting. It’s also about scaling and evolving solutions over time. If we accept that today’s solution will be different from tomorrow’s, then we should focus on meeting our immediate needs in a way that doesn’t constrain our ability to respond when things change later. The heart of Agile is adapting gracefully to changing needs with software.

Figure 2. Comparing and contrasting Lean and Agile

The real benefit comes when we bring all three mindsets together. Too often, the question is “Lean or Agile?”. The answer is “and”, not “or”: it’s Design Thinking, Lean, and Agile. That’s easy to say, but how do we do it, and what does it look like in practice? Here are some lessons learned from applying Design Thinking, Lean, and Agile in the wild.

Purpose, Alignment, and Autonomy

“Be stubborn on the vision, but flexible on the details.”
 — Jeff Bezos

Building a product is a lot like a combat mission. A team of skilled people operates in conditions of high uncertainty; a commander sets clear outcomes with some guiding principles; but we expect the unexpected; and we’re trained to take best action, responding to new information as the situation unfolds.

All of that takes discipline. And practice.

In military operations, it’s called disciplined initiative, and soldiers train so they can practise the movements of combat. In Mike Rother’s Improvement Kata, it’s called deliberate practice, and it’s how we practise the movements of scientific thinking. This is how product teams can align to purpose, explore uncertainty, and learn their way to achieving desired outcomes.

Pro tip: Try visualising the whole end-to-end process, from aspirations and hypotheses to design experiments and feedback on a big product wall, so that the whole team can play along together.

Figure 3. A product wall

Measure Things That Matter

“If a measurement matters at all, it is because it must have some conceivable effect on decisions and behavior.”
 — Douglas W. Hubbard

How will you measure outcomes? When will you know you’ve achieved them? Will your metrics help you make a decision?

We all know that vanity metrics — like total page views, or total new customers — are pointless. But knowing what not to measure doesn’t make measuring the right things any easier. Even with the right motives, there are lots of ways we get it wrong. Suppose you’re operating an online store selling tens of thousands of unique items to all kinds of buyers. Making it easier for them to find what they’re looking for is one of your goals. Now try breaking that goal down into metrics that help you know if you’re on the right track.

Figure 4. The good and bad of goal-based measurements

Pro tip: Structure your metrics around future decisions you want to make. Only measure things that indicate progress toward your goal. Hypothesis Driven Development offers a way to frame outcomes, beliefs, and metrics in a simple, repeatable format. It gives some structure for finding the right metrics, and makes it easy to communicate to others.
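
For illustration, the online-store goal above might be framed something like this (the specific metrics are invented for the example; it’s the structure that matters):

  We believe that making it easier to filter and refine search results
  will result in shoppers finding what they’re looking for with fewer abandoned searches.
  We’ll know we’re on track when search-to-purchase conversion improves and repeat searches per session fall.

The point is that the measure is tied to a decision: if the signal doesn’t move, we rethink the approach rather than keep building.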

Make Decisions Based on Learning

“Don’t look for facts or answers — look for better questions. It’s the questions we ask, and the meaning we explore, that will generate the insights most useful to strategy.”
 — Dr Jason Fox

Why do we learn? To make better decisions. Many solutions fail because they solve no meaningful problem, and we tend to fall in love with our ideas and let our biases get the better of us. Even when we try to de-risk our decisions by testing our ideas and running experiments, we don’t always get it right. One trap is to conflate a good prototype test result with strong affinity for the problem, or customer demand for the solution. Each of those — problem, solution, and demand — is a separate concern, and each requires a different learning approach.

We don’t need to be scientists to learn the right things, and insight doesn’t have to belong only to the research team. If we think through our approach, it can help us make better decisions. We can start by:

  1. Defining our beliefs and assumptions (so that they can be tested)
  2. Deciding the most important thing to learn
  3. Designing experiments that will deliver learning

Pro tip: The Problem-Assumption Model is an easy way to get started. It’s a way of helping us ask: What’s the problem? How might we solve it? What assumptions have we made? How will we test our assumptions?

Figure 5. The Problem-Assumption Model, created by Jonny Schneider and Barry O’Reilly

Many Mindsets, One Team

Most important of all, it’s about working together and achieving together. Learning is a team sport, and collaboration is key if we’re going to find our way to the place we want to be. There is no one correct way, nor is one single mindset enough. But all together, elements of each mindset help us to find our way forward.


Figure 6. How the three mindsets overlap

Instead of focusing on applying a process, teams ought to challenge how they think and try new things, embrace the things that work, and learn from the things that don’t. The right way will be different for each team in their own specific context. Success is about how teams develop new ability, learn by doing, and adapt to what is learned.

Pro tip: Try Jeff Patton’s Dual Track Development. It describes how teams can bring product discovery and product development closer together into one collaboration.

Want to read more? Get the free ebook from O’Reilly, then rate it on Goodreads.

Finding the fastest path to feedback

Asking better questions and learning to facilitate simple customer interviews is one of the fastest ways to bring customer-centred design into the boardroom. Customer learning is powerful for developing strategy, and it’s not just about their wants and needs. Learning their motivations, attitudes, and behaviour helps to define products and services that have a chance of succeeding. Yet many leaders choose to forego early opportunities to learn from customers because...

“There’s no time for that” 
“We already know our customers”
“The marketing and customer insights team always do all of the research”
“We don’t have the right skills to do it properly”

While sometimes those are reasonable concerns, too often it means customer insight is delayed so much that it becomes impotent. The purpose of research is to learn something to inform a decision about what to do next. It gives us confidence to continue, or a reason to explore new lines of enquiry. It’s how we find a path, and shape our beliefs about how we’re going to win. Most importantly, it helps decide where to invest. Leaders need research that helps them learn fast, make decisions, and keep moving.

The purpose of research is to learn something to inform a decision about what to do next.

Expert researchers have lots of ways to do the best research. We can be confident in their analysis, and those neat summary reports are a terrific reference. The trouble is that thoroughness takes precious time - a scarce resource for most teams. Insight that arrives weeks or months after the question is often irrelevant. Not because it’s wrong, but because the product team has already moved on to new questions to find answers to. Those questions need new research! And so it goes ad infinitum.

You can break the cycle by doing your own research in hours, not weeks. Basic design research techniques are powerful tools that are easily learned. With some simple instruction and a little guidance, product teams can 10x their customer insight capability by connecting decision makers directly with customers.

Here’s how to get started.

Find better questions

Before asking questions, we need an idea of what answers we’re looking for. We need to design questions that push the conversation in a useful direction. Otherwise, we may have a lovely chat, but mightn’t learn anything that helps determine if we’re on the right track. 

Everyone loves a quadrant model. They work because they’re simple. This one helps break down our beliefs and identify the underlying assumptions in our thinking. Those assumptions are then translated into questions that become the foundation of enquiry during customer interviews.

You’re a digital director at a personal investment company. Your competitors all have robo-advisory services, and the executive leadership team feels they ought to have one too. You have three months and $2M to make it so.

The solution is a robo-advisory service. We think it solves the customer job of managing a personal investment portfolio. What assumptions have we made? That traditional advisory services are out of reach; that people will trust automated advice enough to make financial decisions; and that they’re willing to pay something for the service. Now we can design questions to find out if those assumptions hold, and understand why or why not.
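
For illustration only, those assumptions might translate into interview questions like:

  • Tell me about the last time you got advice about your investments. What did that involve, and what did it cost you?
  • Have you ever acted on a recommendation from an app or a website? Talk me through what happened.
  • How do you decide whether advice like that is worth paying for?

Notice they’re open and probing, not “would you use a robo-adviser?”.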

This simple approach is powerful because it’s flexible. You can start from anywhere - problems, solutions, assumptions or questions - and elaborate your thinking. It’s common to start with a solution and elaborate on the problem it solves. It’s easy to then identify the implied assumptions, and these lead us to the questions that will help us learn from customers. As adoption of Design Thinking grows, more and more teams start with the problem, and then elaborate solutions, assumptions, and questions.

With questions to answer, it’s time to talk to customers.

Do simple interviews

Talking with people is a great way to find out whether your ideas are worthwhile or your hypotheses hold water. A professional researcher might do better interviews, but most people are capable of research that is good enough. You don’t need a certification to be an empathetic listener. Remember, it only needs to be good enough to inform a decision about what to do next. A lot of the time, the next thing is to explore a new question. It’s a quest for learning, not a quest for certainty, so the consequences of being ‘wrong’ are relatively low.

Interviewing is a quest for learning, not a quest for certainty, so the consequences of being ‘wrong’ are relatively low.

Just have a go; you’ll be surprised at how quickly you become competent. Follow these tips to get started.

1. Ask open, probing questions

Keep your questions open, not closed. Your goal is to keep the participant talking. Meaning and insight are most often found in the stories that surround someone’s experience, not just the final outcome.

You’re working with Transport for London to improve the commuter experience. Knowing that Alessandra doesn’t like catching the train is a piece of data. You can collect it by asking “do you like catching the train?”. That information is meaningless.

There’s more to learn by asking open questions, like “tell me about your experiences in travelling on trains”. You’ll likely get a more meaningful response. Alessandra might tell you about a seedy passenger with wandering eyes. It was an uncomfortable journey, and when this creeper alighted at Alessandra’s home station, she hesitated, feeling vulnerable, and decided to stay on the train for safety. This information is now much more meaningful, and might contribute to solutions that help travellers stay safe while commuting.

If you’re stuck for a question, or need a few seconds to gather your thoughts, try one of these:

Ask open questions

  Like this...
  • Tell me about a time when…
  • Talk to me about what it was like to…
  • Tell me about how that happened.

  Not like this...
  • Are you happy with how that went?
  • Did you get everything that you needed?
  • Is there anything else that’s noteworthy?

Ask probing questions (instead of affirmative questions)

  Like this...
  • What do you think caused you to do that?
  • What were you thinking when...?
  • How did it make you feel when…?

  Not like this...
  • Was it the cancelled transaction that caused you to get frustrated?
  • And this made you feel angry, didn’t it?
  • You thought that was rubbish, didn’t you?

2. Use description to set the scene

Using believable scenarios helps to prime people to think more deeply about how they might respond in a given situation. It draws out detail that might otherwise be skipped over. Even though we can’t rely on how people say they will behave, describing a scenario gives us a better chance that we’ll connect, and learn what people really think.

Don’t reveal an idea and ask for opinion. That’s conjecture. It’s much better to set a relatable scenario; describe a solution; observe the reaction; and ask further probing questions to better understand it. This works well in many situations, but is especially useful when combined with sketches, prototypes or other visual props that people can interact with. 

3. Use props to observe reactions

Using simple sketches is a fast and effective way to start testing assumptions. You’ve set the scenario; now show a solution as a sketch or rough wireframe and observe what happens next. So much can be learned by watching, not just listening.

So much can be learned by watching, not just listening.

You’re testing a new proposition in a booking service that uses economies of scale to acquire new paying customers by offering group discounts. Success relies in part on a belief that customers will tolerate a few extra steps and some social sharing to qualify for a small discount.

A picture is worth a thousand words. 

Figure. Group booking concept sketch (GroupBooking_v1)

What do you understand this to be? What sense do you make of it? What would you do now? What might you do if…? What’s your expectation about what happens next? And so on.

The concept is now tangible and relatable. These visual props give people something more specific to share their thoughts about.

In summary

Talking with customers isn’t just for research experts. Embrace the entrepreneurial spirit, get out of the building, and find out straight from the people whether your thinking really adds up. Keep it simple and get involved. You’ll find that what you learn in one day talking with customers is more than you thought possible. Don’t do it once, do it always. And use what you learn to make better decisions faster. You’ve nothing to lose, and so much to gain.

There’ll be opportunities to do more thorough research later. As strategy evolves from initial learning, stakes are raised, investment increases, and experiments become more elaborate. That’s when to bring in the experts. They’ll have ways to increase confidence in any findings - often by using several methods to triangulate the results. Early on though, a general proximity to the truth is more important than precision.



Digital product development explained in 5 minutes

Here's a 5-minute animated walkthrough of the why, what, and how of digital product development using the Double Diamond. The high-level approach is very intentional and deliberate. How it's executed in practice is infinitely variable to suit specific needs. It's a way of thinking, not a process!

Product Strategy Constellations

I’m always looking for workshop methods that work. I have some favourites, but workshop facilitation is full of uncertainty. Nearly always, modification is needed, and most planned activities work better when we’re willing to stay loose. This time, the situation called for something completely new.

We needed something to cut through the noise. More than just a roadmap, we wanted to see relationships between parts in strategy. These parts included more than just software products. Operations, technology platforms, partnerships, business acquisitions, and software products all needed a place on the map. We wanted to create a visual model to show a single view. One reference for many teams.

All things considered, our visual model needed to:

  • Map relationships between all initiatives in a programme
  • Serve as a single view of strategy for many teams
  • Show both tactical and strategic activity in the same view
  • Show software initiatives alongside other kinds of work
  • Be easy to understand and update

We dreamed up the Strategy Constellation Map

A Strategy Constellation Map shows the relationship between elements of a product strategy and an overarching programme of initiatives.

Components of the Strategy Map

Let's take a look at each part of the map in more detail.

Business Objectives

The map above shows business objectives representing a revenue funnel. The first two are about attracting and activating new customers. The third objective is about successful fulfilment of service - which in this case correlates with revenue generation. Your business objectives are probably different, but you get the idea.

When organised this way, it's easy to see all the things that contribute to an objective in one place. Scanning across the map helps us define ‘thin slices’ of value. That is, small chunks of work that deliver tangible benefit when done together. This is the work you want to do first. 
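
For example (purely illustrative), a thin slice for an activation objective might combine one small release candidate from a product roadmap with the single programme initiative it depends on - say, a partnership integration - rather than delivering a whole roadmap before anything else moves.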

If you'd rather focus on just one objective, the map still helps. All relevant entities are shown together for easier prioritisation.

Product Roadmap

Here, product teams can see how the work they do maps to the bigger-picture strategy (above the waterline). It’s an exploded view showing the roadmap for each item in the programme. The roadmap shows release horizons. Each horizon includes prioritised release candidates.

The product roadmap shown above has three release horizons:

  1. Now: Minimum functionality for basic provision of service. Some may call this the MVP.
  2. Next: Optimisation for a better customer experience.
  3. Later: Items for further consideration at a later time

Programme Initiatives

Above the waterline, all the initiatives that make up the programme of work are shown. These could be other products or services, changes to business processes, technology simplification, partnerships, new business models, and so on. Detail is less important. What we want to see is how these packages of work correlate with business objectives.

Visibility is the main goal here. Showing relationships makes it easier for teams to collaborate, and it also reduces duplication. For example, a team working to optimise customer activations can easily see adjacent or related activity in the visual model.

Constellations show the relationship between entities


Bringing it all together

Constellation maps connect product-level strategy to a programme-level view. Mapping these relationships together with business objectives, we achieve a single view. Product teams can ‘look up’ and see the broader context of what they’re working towards. It’s just as useful for strategy teams to ‘look down’ for a snapshot of how product-level initiatives contribute to the bigger picture.


These are great for product teams, because one-to-many relationships are clearly shown. One product, mapping to many entities in a big strategy. However, many-to-many relationships are not so easy. Things above the waterline (programme initiatives) may have a relationship with many entities in several product roadmaps. That's hard to show in a single view.

Constellation Maps work best when there are a handful of business objectives. Let’s say six or fewer. Any more than that, and mapping between entities becomes a bit silly, visually.

They're strongest as a visual model. For ongoing management of scope and dependencies, these maps would surely become cumbersome over time. Especially if there are many product roadmaps within one programme of work.

If this is a useful visual model for you, I'd love to hear your thoughts in the comments!