Lean, Bias, Impartiality and Justness

I considered myself to be appropriately biased against biases, but it turned out I was wrong. Such is often the way with cognitive biases. Reading Daniel Kahneman’s book Thinking, Fast and Slow, I learned to think of biases in a new way. His work is often cited and has become increasingly important in behavioral economics, decision science and human psychology. It turns out that what Doctors Deming and Juran taught us, that 80 to 95 percent of an organization’s problems stem from the system and not from the people working within it, is only half of the answer.

Bias means to be unfairly partial, prejudiced, or leaning toward one side of an issue without factual justification. The opposite of bias is impartiality, fairness, conforming to rules and standards in how we decide, or justness. In a pleasing bit of wordplay, we can say that “to be biased” is to have an unfair or inappropriate inclination toward a thing or action, or “to have a leaning”. On the other hand, in the business excellence usage, “to Lean” means to have the property of making decisions in a way that is fact-based, impartial, fair and resulting in justness.

We often make statements such as “…but I am biased” when admitting a preference about a particular approach or choice about which one should supposedly be making an objective judgment. We acknowledge that we are not making a fully objective evaluation, but at the same time seek and give forgiveness or even permission for holding biases as part of the human condition. In common use, bias has taken on a similar but somewhat less negative connotation than near-synonyms such as prejudice or intolerance. We have developed a tolerance for bias. In day-to-day use, bias has almost come to mean opinion, to which we are only too eager to entitle each other.

In the author’s words, “Systemic errors are known as biases, and they recur predictably in particular circumstances.” Kahneman’s association of bias with error implies an underlying assumption that there is a correct judgment, a correct choice or answer, and that our biases are not justified preferences but in fact errors. Kahneman’s work makes a convincing case that there is a set of biases that are rooted in our biology which lead us to make poor decisions. We need to strengthen our grasp on statistics and probability in balance with our intuitions.

As pioneer of kaizen and architect of the Toyota Production System, perhaps the greatest legacy of Taiichi Ohno was not the logistics, quality and continuous improvement systems he built at Toyota but rather the influence he had on a generation of leaders at that company. Ohno was deeply concerned with what he termed “misconceptions”. He taught that our casual perceptions are not to be trusted. Only through thorough, up-close observation and verification of one’s beliefs by trial and error would Ohno allow a student to get away with the claim “I understand”. These misconceptions are incorrect understandings which drove people and organizations to make decisions and build systems that were wasteful or did not effectively serve the intended purpose. What Ohno called “rationalization” was not merely cost-driven consolidation but “to do what is rational” based on observation of facts and the challenging of practices built on our misconceptions or biases.

Taiichi Ohno passed on before he could “ask why 5 times” and delve deeper into the causes underlying these thought illusions he observed within industry. If Kahneman were familiar with Ohno’s work and interest in misconceptions, he would likely trace them through human psychology and human evolution. In many ways, the insight of Japanese industrial leaders to embrace the work of Dr. W. Edwards Deming so thoroughly in the post-World War II years was a brilliant countermeasure to a set of biases that continue to plague organizations worldwide through poor decision making, poor quality and poor performance. Statistical thinking does not come naturally to humans; in the short term instinct triumphs, and is often adequate. In the long term, we regress to the mean.
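Regression to the mean is easy to state and easy to misjudge intuitively, so a minimal simulation may help. The numbers below are illustrative assumptions, not data from Deming or Kahneman: each worker has a stable “skill” plus day-to-day noise, and we pick the apparent top performers from a single noisy measurement.

```python
import random

random.seed(42)

# Each worker has a stable "true skill" plus day-to-day noise ("luck").
true_skill = [random.gauss(100, 5) for _ in range(10_000)]
day1 = [s + random.gauss(0, 15) for s in true_skill]
day2 = [s + random.gauss(0, 15) for s in true_skill]

# Select the top 1% of performers based on day-1 scores alone...
top = sorted(range(len(day1)), key=lambda i: day1[i], reverse=True)[:100]

day1_avg = sum(day1[i] for i in top) / len(top)
day2_avg = sum(day2[i] for i in top) / len(top)

# ...and their day-2 average falls back toward the population mean of 100,
# because much of their day-1 advantage was noise, not skill.
print(f"day-1 average of top group: {day1_avg:.1f}")
print(f"day-2 average of top group: {day2_avg:.1f}")
```

The same arithmetic explains why rewarding one good day or punishing one bad one tells us little: without statistical thinking, we credit or blame people for what is mostly variation in the system.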

What intrigued me about Kahneman’s sentence “Systemic errors are known as biases” is the realization that we can in fact think of any system as a set of biases. A system is a set of processes and sub-systems, and within these are various processes and rules, the outcomes of decisions. These processes run according to either correct or incorrect parameters; decisions made based on biases or on facts. Systems based on incorrect assumptions are biased toward, or tend toward, failure, entropy, loss, variability and waste in general. Superior systems are not only intolerant of biases, they systematically challenge them by requiring participants in the system to go see, ask why and speak with data. Lean management systems are built on better sets of assumptions, ones tested and proven to be true, and tend to minimize errors and undesirable outcomes.

Non-lean operations suffer from systemic errors such as the inability to match supply with demand, the inability to produce good quality the first time every time, the failure to maintain perfect safety, the failure to solve problems systematically and in a standardized way, the failure to respect and fully engage human potential in the endeavor of any team, organization or society. These are not due to personal faults of our leaders, they are due to systemic errors that allow us to continue believing that the way we work and live, the leaders we choose and the processes for making these choices are OK, have no lasting negative consequences, or are not in fact biases, systemic errors. Non-lean systems are often based on belief, bias and story-telling.

Lacking the statistics to support this idea, I must admit that it may be purely a figment of my personal bias (opinion). But on reflection, Kahneman’s work seems to say that systemic errors within social systems such as governments, hospitals, schools and companies are there due to our cognitive biases. Call them misconceptions or false beliefs; in the end systems do not exist a priori, they are built, enabled or allowed by people. Unless systems are delivered to us innocent humans from some Platonic plane, we as people are responsible for the systems that we blame. This seems to flip Dr. Deming’s notion that people are never to blame and that systems are always to blame, on its head. The finger points back at people. This is not such a scary idea; it just requires replacing the notion of assigning blame with one of taking responsibility.

The lean journey is a daily struggle against ourselves, our biases and our biological inclinations. The universe does not care; it is governed by statistics, which we are at liberty to remain ignorant of, replace with our beliefs or use to our advantage. The lean journey is the pursuit of systems ever more free of bias, impartial and just. To paraphrase Martin Luther King, Jr, “Let us realize the path to a Lean Operating System is long but it bends toward justness.”

2 Comments

  1. Jorge Wong

    March 20, 2012 - 7:56 am

    Jon,
This is an outstanding article, which provides effective and ‘unbiased’ background for the 3 principles of the Toyota Way: Go there and see, show respect, ask why.
    Now, I think you partially misunderstand Deming when you write:
    “This seems to flip Dr. Deming’s notion that people are never to blame and that systems are always to blame, on its head. The finger points back at people.”
    Note: Deming never said never. Instead, he told me “you have to be practical, start with yourself.”
    In fact, Deming clearly points at working with people first. He stated: “The first step is transformation of the individual. This transformation is discontinuous. It comes from understanding of the system of profound knowledge. The individual, transformed, will perceive new meaning to his life, to events, to numbers, to interactions between people.
    “Once the individual understands the system of profound knowledge, he will apply its principles in every kind of relationship with other people. He will have a basis for judgment of his own decisions and for transformation of the organizations that he belongs to. The individual, once transformed, will:
    -Set an example
    -Be a good listener, but will not compromise
    -Continually teach other people
    -Help people to pull away from their current practice and beliefs and move into the new philosophy without a feeling of guilt about the past”
    Jon, this last point goes to the core of your article, as you say: “This is not such a scary idea, it just requires replacing the notion of assigning blame with one of taking responsibility.”
    Deming’s is a good roadmap for a life-time of bias and waste elimination. Please SEE: http://deming.org/index.cfm?content=66
    Following Shewhart we need to work first with the people, as individuals, and other special causes of variation; next comes the system, as the source of common cause variation. This is the core of statistical thinking.
    I think your article is 100% aligned with what Deming and Juran said and did, unless there is a little bias :-).
    Thanks for a great article!
    Jorge B. Wong

  2. Robert Drescher

    March 28, 2012 - 10:08 am

    Hi All
    Deming never implied that people are not to blame for most problems. In fact, his statement that 94% of problems are the result of poor systems points out that certain people are to blame.
    Poor systems are not the fault of workers, but they are the fault of the managers and/or engineers who created them. We need to remember that he talked far more often to management and support staff than to line workers. Management and support staff have a bias to blame workers for what went wrong, and still often do. Yet what Deming was pointing out is that management and support staff need to fix the system problems so that workers can then fix the remaining problems that are within their ability.
    Constantly trying to fix the six percent first does not achieve much in reality; by first tackling the system problems, you can then more easily fix the rest.
    Deming was trying to politely shift the bias that workers are to blame first to one where the system creator is to blame first.