In this TED video, Simon Sinek summarizes the key thing that differentiates an idea that catches on from one that flops.
This is relevant to us at a couple of levels.
First, as Sinek points out, truly great companies succeed because they stand for something higher. They have a “why” that drives what they do and how they do it.
Companies that cannot articulate what they stand for are at a competitive disadvantage against those that can.
But these concepts are also critical to those of us who are trying to sell the concept of changing the way our own organizations run. Watch the video – then continue below.
In spite of what is taught in business schools, business decisions are rarely made based on financial analysis and rates of return. Those analyses are carefully constructed, but often after the fact, to justify what someone has already decided to do.
When we try to sell our changes, we often try to address the “what’s in it for me?” question, yet still fall back on logical “what” type arguments.
That doesn’t work. It has to feel right.
Think about your own organization. When or where do things feel like they are going really well? What is aligning? What values are being realized? How do those moments differ from the times or places where things are not going so well?
What makes people say “Oh yeah!”?
As you try to make the case for “lean” or continuous improvement in your organization, are you crystal clear what you believe in? Can you articulate it? Do others in the company want to believe in the same things?
This question comes up now and again; it was recently posed on the LEI forums by someone looking for help with a lean facilitator’s job description.
I extrapolated from his question that he was looking to the job description as a line of defense against dilution of the facilitator’s focus and effort by projects that might not be going in the appropriate direction.
In effect, this puts the lean facilitator in the role of a weakened zampolit (political officer), charged with teaching the “correct view” and challenging decisions that run counter to it – except that, more often, he has to sell the “correct view” rather than impose it.
The fact that the question is being asked at all indicates that the organization has not really thought through its operational vision. How will the company work? What are the roles and responsibilities of the leaders?
What are the leaders’ job descriptions in this new world?
Those job descriptions become a target condition for each of them.
What is the gap?
If there are gaps in skills and knowledge, then we need countermeasures.
At this point, the role and responsibility of a lean facilitator might begin to emerge as one of those countermeasures. Don’t have the expertise? Import it.
What doesn’t work, though, is to use the lean facilitator to substitute for the leader’s full and direct participation in the process of improvement. And no job description, no matter how carefully crafted, can fix that.
There are a lot of variations on a theme where someone asks an Internet forum how to quantify or justify the benefits of implementing a continuous improvement program.
If you think about it, though, this is a really interesting question.
What are the benefits of NOT having continuous improvement? Why would managers deliberately decide not to have a learning organization, not to have continuous improvement, not to fully engage the intelligence of their workers?
Why would managers deliberately decide not to improve safety, quality, delivery, lead times?
What if we asked the question that way?
What is the benefit of not having these things?
If that question is subsequently dismissed as stupid (which I hope it would be), then the question is no longer whether they should be pursued, but how.
I have probably written around this question in the past, but it comes up often enough that I wanted to address it specifically.
One of the challenges facing the lean practitioner is the “What can you do for me?” boss (or client).
This manager wants to know the expected ROI and outcome of your proposal before he agrees to make the investment in improvement.
This style of proposal-evaluation-decision management is exactly what is taught in every business school in the world. The process of management is a process of deciding between alternative courses of action, including no action at all.
This approach actually makes “no action” the baseline. Any change is going to disrupt the status quo and incur some kind of cost. Therefore, the thinking goes, the change had better be worth it. “Am I going to get enough back?”
“What can you do for me?” implies a general sense of satisfaction with the status quo.
The lean thinker reverses this model. The status quo is a stagnant and dangerous place.
There is always an improved state that we are striving for.
Rather than measuring progress from the current state, we are measuring remaining gap to the target, and we must close that gap.
There are problems in the way.
Proposed solutions to those problems are evaluated on (among many other things) cost to see if the solution is an acceptable one, or if more work is required to find a better solution. But maintaining the status quo is not on the table. The decision has already been made to advance the capability of the organization. The only decision is around how to do it, not whether to do it.
So when a legacy GM style manager asks “What can you do for me?” the question must be changed to “What are you striving to achieve?”
Challenging the complacency of the status quo is our biggest hurdle.
5S has become an (almost) unchallenged starting point for converting to lean production. Although the basics are quite simple, it is often a difficult and challenging process.
After the initial push to sort stuff out and organize what remains, sustaining almost always becomes an issue.
Again, because of early legacy, the most common response is what I call the 5×5 audit: a 5×5 grid assigning 5 points across each of the “S” categories. It carries an assumption that managers will strive for a high audit score and thus work to sustain, and even improve, the level of organization.
Just today I overheard a manager trying to make the case that an audit score in his area ought to be higher. It was obvious that the objective, at least in the mind of the manager, was the audit score rather than solving problems.
The target condition had become abstract, and 5S had become a “program” with no evident or obvious purpose other than the general goodness that we talk about upon its introduction.
If the audit score is not the most important thing, then why do we emphasize it so much? What is our fascination with assigning points to results vs. looking at the actual results we are striving to achieve?
To digress a bit, many will say at this point that this is an example of too much emphasis on audits. And I agree. But this is more common than not, so I think of this as an instance of a general problem rather than a one-off exception.
Our target condition is a stable process with reduced, more consistent cycle times as less time is spent hunting for things. Though we may see a correlation between 5S audit scores and stability, it is all too easy to focus on the score and forget the reason.
Shop floor people tend to be intelligent and pragmatic types. They do not deal in a world of abstraction. While the correlation might make sense to a manager used to dealing in an abstract world of measurements and financials, that is often not the case where the work is actually done.
The challenge is: How do we make this pragmatic so it makes sense to pragmatic people?
Let’s start by returning the focus to pragmatic problems. Instead of citing general stories where people waste time looking for things so we can present a general solution of 5S, let’s keep the focus on specifics.
What if (as a purely untried hypothetical) we asked a team member to put a simple tick mark //// on a white board every time he has to stop and hunt for something, or even dig through a pile to get something he knows is in there? Multiply that simple exercise by all of the people in the work area, add up the tick marks every day, and track the trend, and you may just get more valuable information than you would from the 5×5 audit done once a month.
What if we actually tracked stability and cycle times? Isn’t avoiding these wastes the case we make for 5S in the first place? So perhaps we should track actual results to see if our understanding is correct, or if it has gaps (which it does, always).
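To make that concrete, here is a minimal sketch – purely hypothetical, in Python – of what tallying those tick marks and watching the trend might look like. The work area names, days, and counts are all invented for illustration.

```python
# Hypothetical illustration only: add up the daily "had to hunt for something"
# tick marks from each work area's white board and watch the day-to-day trend,
# rather than scoring a 5x5 audit once a month. All data below is made up.

def daily_totals(ticks):
    """Add up the tick marks across all work areas for each day."""
    return {day: sum(areas.values()) for day, areas in sorted(ticks.items())}

def day_over_day_change(totals):
    """Negative numbers mean less time was spent hunting than the day before."""
    days = list(totals)
    return {days[i]: totals[days[i]] - totals[days[i - 1]]
            for i in range(1, len(days))}

# day -> {work area -> tick marks recorded on the white board that day}
ticks = {
    "day 1": {"Assembly": 14, "Test": 6},
    "day 2": {"Assembly": 11, "Test": 7},
    "day 3": {"Assembly": 9, "Test": 4},
}

totals = daily_totals(ticks)
print(totals)                       # {'day 1': 20, 'day 2': 18, 'day 3': 13}
print(day_over_day_change(totals))  # {'day 2': -2, 'day 3': -5}
```

The point is not the tooling – a white board and a pencil do the job – but that the thing being tracked is the actual problem, not an abstract score.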
What if we taught area leaders to see instability and off-task motion, and to see those things as problems? Let them understand what workplace disorganization causes.
How about tracking individual problems solved rather than a general class of blanket countermeasure?
How many sources of work instability did we address today? I’d like to see what you learned in the process. What sources of instability did you uncover as you fixed those? What is your plan to deal with them? Great!
No problems today? OK – let’s watch and see if we missed anything. OH! What happened there? Why did we miss that before? Could we have spotted that problem sooner? What do we need to change so we can see it, and fix it, before it is an issue with the work?
These are all questions that naturally follow a thorough understanding of what 5S actually means.
But we have had 5S freeze dried and vacuum packed for easy distribution and consumption. At some point along the way, we seem to have forgotten its organic state.
“Are we trying to force compliance or develop leaders?”
The answer to this question is going to set your direction, and (in my opinion) ultimately your success.
It comes down to your strategy for “change.”
When people talk about “change” they are usually talking about “changing the culture.” Digging down another level, “changing the culture” really means altering the methods, norms and rituals that people (including leaders) use to interact with one another.
In a “traditional” organization, top level leaders seek reports and metrics. Based on those reports and metrics, they ask questions, and issue guidance and direction.
The reports and metrics tend to fall into two categories.
Financial metrics that reflect the health of the business.
Indicators of “progress” toward some kind of objective or goal – like “are they doing lean?”
Floating that out there, I want to ask a couple of key questions around purpose.
There are two fundamental approaches to “change” within the organization.
You can work to drive compliance; or you can work to develop your leaders.
Both approaches are going to drive changes in behavior.
What are the tools of driving compliance? What assumptions do those tools make about how people are motivated and what they respond to?
What are the tools of leader development? What assumptions do those tools make about how people are motivated and what they respond to?
I have posted a few times about the “management by measurement” culture and how destructive it can be. This TED video by Daniel Pink adds some color to the conversation.
Simply put, while traditional “incentives” tend to work well when the task is rote and the solution is well understood, applying those same incentives to situations where creativity is required will reduce people’s performance.
We saw this in Tom Wujec’s Marshmallow Challenge video as well, where an incentive reduced the success rate of the teams to zero.
This time of year companies are typically reviewing their performance and setting goals and targets for next year.
It is important to keep in mind that there is overwhelming evidence that tying bonuses to key performance indicators is a reliable way to reduce the performance of the company.
Yesterday, Kris left a great comment with a compelling link to a TED presentation by Tom Wujec, a fellow at Autodesk.
Back in June, I commented on Steve Spear’s article “Why C-Level Executives Don’t Engage in Lean Initiatives.” In that article, Spear contends that business leaders are simply not taught the skills and mindset that drive continuous improvement in an organization. They are taught to decide rather than to experiment and learn. Indeed, they are taught to analyze and minimize risk to arrive at the one best solution.
Tom Wujec observes exactly the same thing. As various groups are trying to build the tallest structure to support their marshmallow, they consistently get different results:
So there are a number of people who have a lot more “uh-oh” moments than others, and among the worst are recent graduates of business school.
[…]
And of course there are teams that have a lot more “ta-da” structures, and, among the best, are recent graduates of kindergarten. […] And it’s pretty amazing.
[…] not only do they produce the tallest structures, but they’re the most interesting structures of them all.
What is really interesting (to me) are the skills and mindsets that are behind each of these groups’ performance.
First, the architects and engineers. Of course they build the tallest structures. That is their profession. They know how to do this, they have done it many thousands of times in their careers. They have practiced. Their success is not because they are discovering anything, rather, they are applying what they already know.
In your kaizen efforts, if you already know the solution, then just implement it! You are an architect or engineer.
BUT in more cases than we care to admit, we actually do not know the solution. We only know our opinion about what the solution should be. So, eliminating the architects and engineers – the people who already know the solution – we are left with populations of people who do not know the solution to the problem already. This means they can’t just decide and execute, they have to figure out the solution.
But decide and execute is what they are trained to do. So the CEOs and business school graduates take a single iteration. They make a plan, execute it, and fully expect it to work. They actually test the design as the last step, just as the deadline runs out.
The little kids, though, don’t do that.
First, they keep their eye on the target objective from the very beginning.
Think about the difference between these two problem statements:
Build the tallest tower you can, and put a marshmallow on top.
and
Support the marshmallow as far off the table as you can.
In the first statement, you start with the tower – as the adults do. They are focused on the solution, the countermeasure.
But the kids start with the marshmallow. The objective is to hold the marshmallow off the table. So get it off the table as quick as you can, and try to hold it up there. See the difference?
Second, and more importantly, the kids know they do not know what the answer is. So they try something fast. And fail. And try something else. And fail. Or maybe they don’t fail… then they try something better, starting from a working solution and improving on it. And step by step they learn how to design a tower that will solve the problem.
Why? Simply because, at that age, we adults have not yet taught the kids that they are supposed to know, and that they should be ashamed if they do not. Kids learn that later.
Where the adults are focused on finding the right answer, the kids are focused on holding up a marshmallow.
Where the adults are trying to show how smart they are, the kids are working hard to learn something they do not know.
Third – look at what happened when Wujec raised the stakes and attached a “big bonus” to winning.
The success rate went to zero. Why? He introduced intramural competition and people were now trying to build the best tower in one try rather than one which simply solved the problem.
Now – in the end, who has advanced their learning the most?
The teams that make one big attempt that either works, or doesn’t work?
Or the team that makes a dozen attempts that work, or don’t work?
When we set up kaizen events, how do we organize them?
One big attempt, or dozens of small ones?
Which one is more conducive to learning? Answer: Which one has more opportunities for failure?
Keep your eye on the marshmallow – your target objective.
Last thought… If you think you know, you likely don’t. Learning comes from consciously applied ignorance.
Edited 2 August 2016 to fix dead link. Thanks Craig.
Now and then, usually when coaching or teaching someone, I get what I think is a flash of insight. Then I realize that, no, there is nothing new here, it is just a different way to say the same thing. Still, sometimes finding a different way of expressing a concept helps people grasp it, so here is one I jotted down while I was working with a plant.
One of the myths of “lean production” is the idea that, at some point, you achieve stability in all of your processes.
Nothing could be further from the truth.
Failure is a normal condition.
The question is not whether you have process breakdowns.
The question is how you respond to them. Actually, a more fundamental question is whether you even recognize a “process failure” that doesn’t knock you over. Our reflex is to build in responses to failure modes that allow things to continue without intervention. In other words, we inject the process with Novocain so we don’t feel the pain. That doesn’t stop us from hitting our thumb with the hammer; it just doesn’t hurt so much.
But think about it a different way.
“What failed today?”
Followed by
“How do we fix that?”
Now you are on the continuous improvement journey. You are using the inevitable process failure as a valuable source of information, because it tells you something you didn’t know.
There is a huge, well established, body of theory in psychology and neuroscience that says that we learn best when three things happen:
We have predicted, or at least visualized, some kind of result.
We try to achieve that result, and are surprised by something unexpected.
We struggle to understand what it is that we didn’t know.
In other words, when we (as humans) are confronted with an unexpected event, we are flooded with an emotional response that we would rather avoid. In simple terms, this translates to “we like to be right.” The easiest way to “be right” is to anticipate nothing.
This takes a lot of forms, usually sounding like excuses that explain why stability is impossible, so why bother trying?
Why indeed? Simple – if you ever want to get out of perpetual chaos, you first have to embrace the idea that you must try, and fail, before you even know what the real issues are.
The Chicago conference turned out to be very Six Sigma centric – in spite of having Mike Rother as a keynote speaker. But that is history.
I want to reflect a bit on this podcast. I invite you to listen yourself – it is an interesting perspective from a senior executive who discusses her own learning and discovery. I will warn you that you may have to “register” on the web site – though you can uncheck the “send marketing stuff” box. I will also say that the interview’s sound quality is pretty bad, so it is hard to hear the questions, but I was able to reconstruct most of them from context.
What is interesting, to me at least, is that the methods and experiences are pretty standard stuff – common to nearly all organizations undertaking this kind of transformation.
A summary of the notes I took:
They have to deliver hard, budget-level savings on the order of 5% a year for the next several years. That is new to them as a government organization.
They started out with an education campaign across the organization.
Initial efforts were on increasing capacity, but those efforts didn’t result in budget savings. In one case, costs actually increased. They don’t need more capacity, they need to deliver the same with less.
They have identified process streams (value streams), and run “rapid improvement events.”
Senior people have been on benchmarking or study trips to other organizations, both within and outside of the health care arena.
They are struggling to sustain momentum in the few months after an “event,” and are seeing the “standard” erode a bit – interpreting this as a need to increase accountability and say “This is how we do things here.”
“Sustaining, getting accountability at the lowest level is the biggest challenge.”
In addition, now that they are under budget pressure, they are starting to look at how to link their improvements to the bottom line, but there isn’t a standardized way to do this.
They believe they are at a “tipping point” now.
There is more, having to do with Ms. Doherty’s personal journey and learning, and with knowledge sharing across organizations that are working on the same things, but the key points I want to address are above.
Please don’t think that this interview is as cold as I have depicted it. It is about 20 minutes long, and Ms. Doherty is very open and candid about what is working and what is not. It is not a “rah-rah see what we have done?” session.
As I listened, I was intently trying to parse and pull out a few key points. I would have really liked it if these kinds of questions had been asked.
What is their overall long-term vision, other than meeting budgetary pressure, “radically reviewing” processes, and “transformation”? What is the “true north” or the guide point on the horizon they are steering for?
What is the leadership doing to focus the improvement effort on the things that are important to the organization? What does the process have to look like to deliver the same level and quality of care at 5% lower cost? What kinds of things are in the way of doing that today? Which of those problems are you focused on right now? How is that going? What are you learning?
What did they try that didn’t work, and what did they learn from that experience?
When you say “local accountability” to prevent process erosion, what would that look like? What are you learning about the process when it begins to erode?
The “tipping point” is a great analogy. What behaviors are you looking for to tell you that a fundamental shift is taking place?
As you listen, see if you can parse out what NHS Bolton is actually doing.
Is their approach going to sustain, or are they about to hit the “lean plateau?”
What would the “tipping point” look like to you in this organization?
What advice would you give them, based on what you hear in this interview?