Can We Adjust Our Way to Quality?

But there is one more dot in figure 9 that has not yet been explained.

The sources of variation

Using weights of a = 0.5, b = 0.0, and c = 0.0, the general PID algorithm becomes the simple P-controller known as Gibbs’ rule. This process-hyphen-control algorithm adjusts the process following each value by an amount equal to half the difference between the target and the most recent value. When this rule is sequentially applied to the example data set, we get the values shown in figures 7 and 8.
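As a rough sketch of how Gibbs' rule operates, the following Python fragment applies the rule to a stream of values. The data and the bookkeeping here are purely illustrative assumptions; the article's actual results for the example data set are the ones shown in figures 7 and 8.

def gibbs_rule(values, target=10.0, weight=0.5):
    """Simple P-controller (Gibbs' rule): after each observed value,
    move the process aim by half the gap between target and value."""
    aim = 0.0                      # accumulated adjustment to the process aim
    observed_series = []
    for x in values:
        observed = x + aim         # what is measured after earlier adjustments
        observed_series.append(observed)
        aim += weight * (target - observed)   # adjust aim toward the target
    return observed_series

# Hypothetical values for illustration only (not the example data set)
print(gibbs_rule([12.3, 9.8, 7.1, 10.4, 13.0, 8.6]))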


Process-hyphen-control illustrated

Only 14 of the 177 PID controllers (8%) did better than manually adjusting with a process behavior chart. One reason for this is that many sets of PID weights will result in an oscillating set of adjustments. With a process like the example data set that is subject to occasional upsets, these oscillations may never die out. This is why about one-third of the PID controllers increased the fraction nonconforming. It turns out that the art of tuning a PID controller is more complex than theory predicts simply because most processes are subject to unpredictable upsets. When your process is going on walkabout, it’s hard for your controller to fully stabilize.
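To see why some sets of weights oscillate, consider a purely proportional controller with a weight greater than 1: each adjustment overshoots the target, so successive errors alternate in sign. The short sketch below is a hypothetical illustration of that behavior, not one of the 177 controllers studied here.

# A P-controller with weight 1.6 applied to a steady process sitting
# 3 units above target: each adjustment overshoots, so the error
# alternates in sign before slowly damping out.
target, aim, offset, weight = 10.0, 0.0, 3.0, 1.6
for t in range(6):
    observed = target + offset + aim
    print(f"t={t}: observed={observed:.2f}, error={observed - target:+.2f}")
    aim += weight * (target - observed)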

So, can we adjust our way to quality? Clearly, many different process-hyphen-controllers were found that reduced the fraction nonconforming for our example data set. But is this all there is?

Gibbs’ rule makes 153 adjustments to the process aim. Because there were only 20 process changes present in these data, many of the adjustments made by Gibbs’ rule must have been either needless adjustments or corrections for those needless adjustments. Nevertheless, by constantly adjusting the process aim, Gibbs’ rule managed to reduce the fraction nonconforming in this case from 48% to 20.5%.


Figure 2: Shifts used to create signals within the data set

Process behavior charts allow us to detect process upsets. Clearly, when we have an upset it’s important to get things operating normally again, so it’s natural to think of using process behavior charts to make adjustments that keep the process at a desirable level. When thinking in this manner, it’s natural to unconsciously insert a hyphen between the last two words of “statistical process control” to obtain “statistical process-control.” And the hyphen changes the meaning. Instead of a name for a holistic approach to analyzing observational data, the hyphen turns SPC into a process-control algorithm that uses statistics.

Our resulting example data set is shown on an X chart in figure 3 and is tabled in figure 12. The average value remains 10.11, but the average moving range is now 2.51. This results in limits that are 26% wider than those in figure 1. Despite these wider limits, nine of the 10 large shifts are detected by points outside the limits. Three more shifts are detected by runs beyond two sigma. So the X chart in figure 3 correctly identifies 12 of the 14 periods when the example data set was off target. This ability to get useful limits from bad data is what makes the process behavior chart such a robust technique.
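For readers who want to reproduce this kind of limit calculation, the natural process limits for an X chart are conventionally computed from the average and the average two-point moving range using a scaling factor of 2.66. The article does not show this arithmetic, so the sketch below is offered only as a reminder of the standard computation.

def x_chart_limits(values):
    """Natural process limits for an X chart: average +/- 2.66 times
    the average two-point moving range."""
    avg = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return avg - 2.66 * mr_bar, avg, avg + 2.66 * mr_bar

# With the averages quoted above (average 10.11, average moving range 2.51),
# the limits work out to roughly 10.11 +/- 2.66 * 2.51, i.e., about 3.4 to 16.8.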

Author’s note: This article is a corrected version of a column that appeared in Quality Digest on July 11, 2016. The earlier version suffered from a programming error that affected all of the PID results.

As the name suggests, a PID controller makes adjustments based on the size of three elements. The first element (proportional) depends upon the difference between the current value and the target value. This difference is known as the error for the current period. Letting the subscript t denote the current time period, the error is defined below.

To achieve full potential, we must not only adjust for process upsets but also find and control the causes of the upsets. And that’s why we’ll never be able to adjust our way to quality.

Appendix


Figure 3: X chart for the example data

Process adjustment techniques are always reactive. They simply respond to what has already happened. They can do no other. At their best they can help to maintain the status quo. Yet sometimes they actually make things worse.


 

With specifications of 6.5 to 13.5, our example data set has 48% nonconforming. Since the data of figures 3 and 4 will be used in what follows, this fraction nonconforming will be our baseline for comparisons as we consider different approaches to process adjustment. Specifically, we’re interested in how much we can reduce the fraction nonconforming while keeping the average close to the target value of 10.
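As a point of reference, the baseline figure is simply the share of values falling outside the specifications. A minimal sketch of that computation follows (the values shown are placeholders, not the example data set):

def fraction_nonconforming(values, lsl=6.5, usl=13.5):
    """Fraction of values falling outside the specification limits."""
    out_of_spec = sum(1 for x in values if x < lsl or x > usl)
    return out_of_spec / len(values)

# Hypothetical values for illustration only
print(fraction_nonconforming([5.2, 10.1, 14.0, 9.7, 6.9, 13.8]))   # 0.5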

The first technique we’ll consider will be using a process behavior chart as an adjustment technique.

Statistical process steering


Source: https://www.qualitydigest.com/inside/six-sigma-article/can-we-adjust-our-way-quality-100223.html

Figure 11 compares what can be achieved by reactive process adjustments and what can be achieved by proactive process improvement. A process is operated up to its full potential only when it’s operated predictably and on-target. Anything less results in excessive variation, which creates excess costs. This process is capable of operating with about 5% nonconforming. While process adjustment techniques may cut the 48% nonconforming in half, they’re not capable of operating this process up to its full potential.


Figure 7: Gibbs’ rule applied to the example data set (153 adjustments)

Error for time period t = E_t = (X_t – Target)

Difference of error terms at time t = ΔE_t = (E_t – E_{t-1})
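For instance, with a target of 10, a current value of X_t = 12.3 would give an error of E_t = 2.3; if the previous error had been E_{t-1} = -0.5, the difference of error terms would be ΔE_t = 2.3 – (-0.5) = 2.8. (These numbers are purely illustrative and are not taken from the example data set.)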

With the example data set, predictable operation would be equivalent to removing the step functions of figure 2. This would leave the running record of figure 1 and the histogram of figure 10.

Adjustments are necessary because of variation. And the variation in your process outcomes doesn’t come from your controlled process inputs. Rather, it comes from those cause-and-effect relationships that you don’t control. This is why it’s a low-payback strategy to seek to reduce variation by experimenting with the controlled process inputs. To reduce process variation, and thereby reduce the need for adjustments, we must understand which of the uncontrolled causes have dominant effects upon our process. And this is exactly what the process behavior chart allows us to do.

Once we view the process behavior chart as a process steering wheel, we open the door to consider other process-hyphen-control algorithms. These algorithms are intended to keep the process on target by continually adjusting the process aim. There are many types of these devices, and they greatly facilitate all kinds of operations today. The type of process-control algorithm that will be considered here is a simple proportional-integral-differential (PID) controller of the type that has been around for more than 100 years.
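To make that structure concrete, here is a minimal Python sketch of a discrete PID adjustment rule with weights a, b, and c applied to the current error, the running sum of errors, and the difference of successive errors. The exact implementation behind the results reported here is not reproduced in the text, so treat this as an illustrative assumption; note that with b = c = 0 and a = 0.5 it reduces to Gibbs' rule.

def pid_adjust(values, target=10.0, a=0.5, b=0.0, c=0.0):
    """Discrete PID-style controller: after each value, change the process
    aim by a*E_t + b*sum(E) + c*(E_t - E_{t-1}), where E_t is the error
    (observed value minus target)."""
    aim = 0.0
    err_sum = 0.0
    prev_err = 0.0
    observed_series = []
    for x in values:
        observed = x + aim
        observed_series.append(observed)
        err = observed - target
        err_sum += err
        aim -= a * err + b * err_sum + c * (err - prev_err)   # move aim back toward target
        prev_err = err
    return observed_series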

 


Figure 9: The fraction nonconforming produced by various PID controllers


Figure 11: What predictable operation can achieve


Figure 6: Histogram of the results of using a process behavior chart to adjust the process

 


In looking for a PID controller that might do better than Gibbs’ rule, I considered a total of 176 additional sets of PID weights, for 177 controllers in all, counting Gibbs’ rule. Of these 177, 61 actually increased the fraction nonconforming! The remaining 116 controllers reduced the fraction nonconforming by various amounts, as shown in figure 9.

Thus, by virtue of background, training, and nomenclature, many people have come to think of a “process control” chart as simply a manual technique for maintaining the status quo. While it’s much more than this, we’ll examine how a process behavior chart functions as a process-hyphen-controller and compare it with other process adjustment techniques.