I re-read my “What Nukes?” post and realized I was really rambling. I want to reiterate a key point more clearly because I think it is important.
The “Bad Apple” theory carries an implied assumption: the cause of an accident or other problem was one person who, at that moment, was not following the documented rules or procedures.
Except in the most egregious cases, such as deliberate misconduct, that is rarely true. Most organizations operate under a set of “norms” that deviate to some degree from the written or established procedures. The reasons for this are many, but usually it is because good people are doing the best they can, in the conditions they are given, to get the job done.
Failure to follow the rules does not, by itself, result in an accident or incident.
Have you ever run a red light or a stop sign? It happens thousands of times every day, and it almost never results in an accident. An accident results only when the other contributing conditions are ripe: running the stop sign AND a car coming through the intersection at the same moment.
The same goes for quality checks, and the more reliable an “almost 100%” process becomes, the more vulnerable you are to the rare defect that slips through. If a defect is only rarely produced, it is unlikely that any kind of human-based inspection will catch it, and the faster the work cycle, the more this is true. The mind numbs; it is impossible to always pay attention to the detail, and the mind sees what it expects. “Failure to pay attention” is never an adequate root cause. It blames an unlucky Team Member for an omission that everyone makes every day just going through life. In this case, it is just “there was a car coming through the intersection.” It is bad luck. It is being blamed for drawing red beads in Deming’s red bead experiment.
So pinning the failure on an individual, while easy, avoids the core issue:
People’s failures in critical processes are a SYSTEM PROBLEM. You must investigate from the viewpoint of the person at the pointy end. What did he see? What did he perceive? What did he believe was happening, and why was that belief reasonable given his interpretation of the circumstances at the time?
The post about “sticky visual controls” got at this. Your mistake-alerts or problem signals must penetrate consciousness and demand attention if they do not actually shut down the process.