Vol. 10, No. 5

May 2008

PQ Systems
 
Contents

Reduce quality costs by thousands with new monitoring product

Quality Quiz: With a video!

Data in everyday life

Six Sigma

Bytes and pieces

FYI: Current releases

 


Six Sigma and more:
Be careful what you notice

Recently, I was reading the fall 2007 edition of the Quality Management Forum from the Quality Management Division of the American Society for Quality. I’m a little behind in my reading. In her article, “Kick Start a Successful Lean Six Sigma System,” Jd Marhevko described the first step in kick-starting a successful lean six sigma system as Define. That, of course, is the first step in the six sigma process. Within her description of that step, she noted that what gets identified gets addressed. She reminded me that what gets noticed gets addressed, and what doesn’t get noticed doesn’t get addressed.

Immediately, I went back to the writings of my mentor and friend, Dr. Myron Tribus. Tribus created the Perversity Principle: “If you try to increase productivity and cut costs by imposing quantitative constraints on a system, you will only succeed in increasing your costs elsewhere in the system.” You can find this principle in “Creating the Quality Service Company” (Quality First: Selected Papers on Quality & Productivity Improvement. National Society of Professional Engineers, Washington, D.C.: 1992). I also referred back to W. Edwards Deming, who said, “The most important figures needed for management of any organization are unknown and unknowable” (Out of the Crisis. Massachusetts Institute of Technology Center for Advanced Engineering Study, Cambridge, Mass.: 1986). My personal experience is consistent with these two principles.

My wife, Carole, and I have worked with hundreds of improvement/design project teams in all kinds of organizations and communities over the last 25 years. One of the most common observations we have made is that once we help a team identify what it thinks is important, the team almost never has any historical data that helps improve the system it is working on. We have concluded that organizations and communities tend to gather data that is easy to gather rather than data on the issues that are most important. Maybe the existing historical data was important once upon a time, but we have found it to be less than useful in most of the systems in which we have worked. In fact, we have commonly found that many of the problems in a system arise because the system is responding to the data it happens to gather, rather than gathering data on the issues that are important to it.

One of the reasons for this phenomenon seems to relate to Deming’s principle above. When an organization decides it wants to be “successful” or “world class,” for example, it consciously or unconsciously creates a surrogate measure for that goal. Then it promptly forgets that the measure is not the goal and operates as if moving the numbers will achieve it. I’m not naive enough to think that all organizations operate this way, but the tendency is real, and our experience indicates that a great many organizations and communities do work in this inefficient or dysfunctional way. The managers of two specific organizations remind us of yet another principle.

When Jim McDonald was president of General Motors, he told a group of us about a frustration of his: he had trouble figuring out what was really going on in the corporation because all his direct reports gave him only good news or, at least, sugar-coated news. His solution was to continually, informally, and surreptitiously contact a group of people with whom he had connected over the years and whom he trusted. That is how he got a more robust view of the state of GM.

When I was at Ford World Headquarters, company chairman Phil Caldwell once sent through a special request to investigate a leak in our magnesium wheels. We knew of no leak in our wheels and told him so. He rejected our analysis. You see, his son had personally experienced such a leak. Although we were not impressed, we knew we had better dig a little deeper. We did. We found the problem: another example of bad news refusing to go uphill. Both of these leaders were skilled at noticing. My last story about noticing comes from tradition.

In my ethics class, I introduce the story of the blind men and the elephant. Upon experiencing the elephant, one man thinks it is a pillar, another a rope, another a solid pipe, and so on. There are many related but different versions of the blind men and the elephant. If you want to know more, I suggest you Google “blind men and the elephant.” The common point of this story, of course, and of the McDonald and Caldwell stories, is that if you really want to understand something, you had better look at it from many perspectives.

This month, I really am advocating that you incorporate three principles into your Six Sigma effort:

  1. Be careful what you notice. What you notice will get attention and what you don’t won’t. And what you don’t notice may even come back to bite you.
  2. Be careful how you notice. Remember “The most important figures... are unknown and unknowable.” Use any surrogate measures with extreme care.
  3. Be careful to notice robustly. Actively seek as many perspectives on what you care about as possible. Your learning and improvement effort will benefit greatly.

As always, I treasure your comments and questions. I’m at support@pqsystems.com.

 

Copyright 2008 PQ Systems.