Monitoring Process Output Using the Statistical Process Control Method

One of the most effective ways to monitor a key process output is the statistical process control (SPC) chart. The method was invented by Dr Walter Shewhart nearly a century ago and, with some enhancements over time, remains the most frequently used process control method. A control chart is essentially a run chart with an upper control limit, a lower control limit and a process average calculated from actual key process output data. The control limits must be calculated while the process is under the influence of stable common cause variation only. These three lines serve as guide posts that reveal the presence of special cause variation in the process, which requires immediate attention. The interpretation of the control chart is covered in my previous article. In order for SPC to monitor the process effectively, manufacturers must create a proper procedure on how to manage SPC implementation.
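To make the three lines concrete, here is a minimal sketch (in Python, with made-up readings) of how the centre line and control limits of an individuals chart are commonly calculated from the average moving range; it is an illustration only, not tied to any particular SPC package.

```python
from statistics import mean

def individuals_chart_limits(values):
    """Centre line and 3-sigma control limits for an individuals (I)
    chart, estimating sigma from the average moving range (MR-bar / d2,
    where d2 = 1.128 for moving ranges of two points)."""
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    sigma_hat = mean(moving_ranges) / 1.128
    centre = mean(values)
    return centre - 3 * sigma_hat, centre, centre + 3 * sigma_hat

# Hypothetical stable readings of a key process output
data = [10.1, 9.9, 10.0, 10.2, 9.8, 10.0, 10.1, 9.9, 10.0, 10.1]
lcl, cl, ucl = individuals_chart_limits(data)
```

The key point the sketch captures is that the limits come from the short-term variation of the data itself (the moving ranges), never from the specification limits.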

After auditing more than 100 suppliers' sites producing electronic parts (first and sub-tier) across the globe, I have yet to see a decent SPC procedure. Some companies do not even have an SPC procedure, and some companies' SPC procedures contain only textbook information on the types of control charts and how to plot them with control limit calculations. An SPC procedure is NOT about how to plot a control chart; it should be about how to plan, implement and manage SPC within the process. Each site or company should establish its own SPC procedure rather than copy one from an SPC textbook.

The list below shows some of the recommended content for an SPC procedure in detail:

Identify the person in charge of SPC.
There must be a dedicated department, or at least a dedicated team, responsible for the implementation of SPC.
SPC training for employees.
Outline the SPC training curriculum for the different levels of employees in the company, such as operators, technicians, engineers and even management.
Select the parameters that need to be controlled.
The most effective method to determine which process output parameters should be monitored and controlled is the failure mode and effects analysis (FMEA) technique. FMEA is a systematic prediction method used to identify potential failures that would impact the customer and the controls needed to minimise or eliminate those failure risks. Other methods include product mapping, brainstorming, etc.
Set up the control chart: chart selection, control limit calculation and rational subgrouping.
Select the most suitable type of chart (attribute or variable), choose a rational subgrouping category (by machine, line or tooling) and a subgroup size, then calculate the control limits. There are a few good articles on rational subgrouping by Dr D. J. Wheeler on the internet. The SPC control points must be updated in the process management plan.
Manage the control limits.
The responsible SPC person must be able to determine when to fix a control limit according to the nature of the process. It is normally recommended to study the trend of 100 subgroup points under the influence of common cause variation only before fixing the control limits. Once the control limits are fixed, they should NOT be revised unless there is a major improvement in one or more process inputs.
Define the out of control (OOC) rules.
No manufacturing process in the world can use all seven Western Electric rules; doing so would make the process too complicated to control and would generate too many false alarms. I normally recommend that companies use only two or three rules to avoid complication.
Reaction plan when there is an OOC condition.
If there is an out of control trend per the company's defined OOC rules, an investigation should be conducted to check for the presence of a special cause. Efforts should be taken to eliminate the unwanted special cause and bring the process back under the influence of common cause variation only. There must be clear ownership to drive the problem to closure, preferably through the company's corrective action request system. Refer to my previous article on corrective action.
Review the effectiveness of the SPC chart.
There should be a check and balance to ensure SPC is implemented correctly and effectively. A review should be conducted monthly or quarterly to confirm that:
  1. False alarms are within the preassigned false alarm goal.
  2. The SPC chart is effective in catching special cause defects, verified by correlating flagged points with special causes that actually occurred.

Ironically, many companies create SPC charts just to fulfil the customer requirement of using SPC to monitor the process. Upon a closer look at these charts, there are many faults, such as:

  1. Control limits are actually spec limits,
  2. Out of control trends/points are not investigated,
  3. The chart shows a cyclic up and down trend due to a wrong subgrouping category.
You can gain more insights on how to use SPC as an effective process monitoring system by taking my course (USD 12.99).

If an SPC chart does NOT serve its purpose of detecting special cause variation, then it is better to remove the chart entirely rather than waste resources maintaining and printing a chart that brings no value!

Process capability index shows 1.33 during the pilot run, yet the reject rate is still more than 10%

When I was working for a multinational corporation as a supplier quality engineering manager, I saw many cases where procurement struggled to get consistent supply from key component suppliers even though the suppliers met the goal of 1.33 for key parameters. The reason given by the suppliers was a poor yield rate of less than 90%. In my previous article, we learnt that the process capability index actually corresponds to a potential reject rate percentage. If the process capability index is more than 1.00, there should be less than 0.27% rejects, or more than 99.73% good parts. So, by right, if a supplier reports a process capability index (Ppk) of 1.33, they should have about a 99.99% yield rate. So where is the gap?
Since the reject rate is estimated from a sample, we will not get an exact match; however, it should be close, such as within 0.1%. There are a few reasons why the reported process capability index does not match, or even come close to, the projected reject rate:

  1. The specification is too wide. The derived specification does not reflect the actual customer requirement; the tolerance could be too loose. When the specification tolerance is too wide, it is very easy to achieve a process capability index of 1.33.
  2. Inaccurate quality metrics data. To obtain the data for a process capability index, we measure the selected quality metrics, and the measurement process itself may contribute too much variation. Inaccurate measurement data leads to an inaccurate process capability index. (Refer to my article dated 28 Sep 2017 on the importance of a good measurement process.)
  3. Biased sampling. The sample selected to calculate the process capability index is NOT a random sample and does not represent the actual population. Almost all suppliers I have worked with cherry-picked parts during the new product (NP) trial run stage so that they could meet the process capability index goal of 1.33. Later, in actual production, they had high reject rates, which could be >10%, and had trouble meeting the delivery schedule.
  4. The sample during the NP stage violated the assumptions required for the process capability index to give a meaningful reject rate: the data is NOT normally distributed, and the data is NOT from a stable process free from special causes.
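The correspondence between a capability index and a projected reject rate described above follows directly from the normal distribution. The sketch below (plain Python, illustrative only; it assumes a normal, stable process) converts an index into the implied reject fraction.

```python
from statistics import NormalDist

def reject_rate_from_index(cpk, centred=True):
    """Rough reject-rate fraction implied by a capability index,
    assuming normally distributed data from a stable process.
    For a centred process both spec-limit tails contribute;
    otherwise only the nearer spec limit is counted."""
    tail = NormalDist().cdf(-3 * cpk)   # area beyond 3*cpk sigma
    return 2 * tail if centred else tail

# Index of 1.00 -> about 0.27% rejects (the 99.73% figure above)
# Index of 1.33 -> roughly 66 ppm, i.e. about 99.99% yield
```

When a supplier reports Ppk 1.33 alongside a double-digit reject rate, this arithmetic is what exposes the contradiction.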

Among the four reasons given for the mismatch between the Ppk value and the reject rate, the most common are cherry-picked measurement data and, as a result of the cherry picking, data that is not normally distributed. Therefore, validation must be done on the process capability index report provided by process engineering or suppliers:

  1. Gage repeatability and reproducibility (GR&R) of the measurement data. Ensure the measurement data collected is accurate and within the GR&R goal of < 10% (up to 30% may be acceptable). Request all the raw GR&R measurement data from the supplier/process and check the data using statistical software.
  2. Plot a histogram or distribution chart of the measurement data for at least 30 samples and check the distribution. If you get a distribution pattern other than figure 1, then most probably screened data has been used. This type of data usually does NOT represent the population distribution; therefore, the process capability index generated does not give an accurate projection of the reject rate.
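The histogram check in step 2 does not need special software. The sketch below (plain Python, made-up helper for illustration) prints a crude text histogram and returns a simple mean-versus-median asymmetry figure; screened or truncated data tends to show a flat-topped or one-sided shape rather than a bell.

```python
from statistics import mean, stdev

def quick_shape_check(values, bins=7):
    """Print a crude text histogram and return (mean - median) / sigma,
    a simple asymmetry measure. A bell-shaped sample gives a value
    near zero; cherry-picked (truncated) data often does not."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / bins or 1.0
    counts = [0] * bins
    for v in values:
        counts[min(int((v - lo) / width), bins - 1)] += 1
    for i, c in enumerate(counts):
        print(f"{lo + i * width:8.3f} | {'#' * c}")
    m, s = mean(values), stdev(values)
    median = sorted(values)[len(values) // 2]
    return (m - median) / s if s else 0.0
```

This is only a first screen; a formal normality test in statistical software is still advisable before trusting the capability index.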

To get a meaningful process capability index that reflects the actual quality of the process, we must ensure the sample represents the actual population. Remember that we never know the true population performance; we rely on the sample to make a correct inference about the population. This is also the case with the process capability index.

I have recorded a full course on process capability analysis and shared it for free on Udemy. You can enrol via the link below. This offer is for a limited time only.

Process output monitoring: what does a process capability index > 1.33 mean?

By now my readers should have a better understanding of what variation is, what a process is, and its inputs and outputs. To achieve consistently good quality product from a manufacturing process, it is imperative to manage and control all process inputs: man, machine, method, material, measurement and environment. The next question is, after we control all the process inputs, how do we know if we have produced consistently good parts per the customer specification or requirement? The only way to know is to measure the output produced and collect measurement data for analysis.

One of the most widely used measurement data analysis methods for determining product quality is process capability. In this method, the process output is measured for a quality characteristic such as a dimension, and the distribution of the data is compared with a predetermined specification. The specification is either given by the customer or derived based on customer requirements.

A simple example is the moulding process for a mobile phone plastic cover. The quality metric in this case is a dimension, such as the length or width of the cover, to ensure it fits properly onto the LCD assembly to become a complete phone assembly. If the length specification provided by the design team is 140-145 mm with a target of 142.5 mm, the moulding process needs to produce parts between 140 and 145 mm. Since it is not practical to measure the length of every part, a sample which represents the population needs to be measured to check whether the length is between 140 and 145 mm. The recommended sample size is at least 30. Once the data is collected, a histogram is plotted, which usually forms a bell-shaped curve known as the normal distribution, per figure 1. Assuming the current process produces most parts centred on the target value of 142.5 mm, the histogram will peak around 142.5 mm. The peak where most of the data is centred is known as the central tendency in descriptive statistics. We then compare the 6-sigma process spread with the specification tolerance.
Figure 1. Normal distribution of the measured length data

If the process spread is less than the specification tolerance, as in figure 2, then there is a higher chance that most parts will meet the specification. The comparison of the specification tolerance with the process spread is known as the process capability index, Ppk or Cpk. If the process spread is more than the specification, as in figure 3, then anything outside the specification is considered a reject. The reject rate will be higher in this case compared to figure 2.

Figure 2.  The process spread is smaller than the specification width;
almost all parts are in spec

Figure 3.  The process spread is bigger than the specification width;
some parts are out of spec

The process capability index can be used to estimate the manufacturing process reject rate. The universally accepted process capability index, the ratio between the specification tolerance and the process spread, is 1.33. Below are the more commonly used process capability index values and their corresponding potential reject rates for the measured quality metric. Commercial statistical software can compute the estimated total reject rate once the process capability index is generated.
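Continuing the phone cover example, the sketch below (Python, with hypothetical length readings invented for illustration) computes the performance index Ppk for the 140-145 mm specification: the distance from the process centre to the nearer spec limit, divided by three overall standard deviations.

```python
from statistics import mean, stdev

def ppk(values, lsl, usl):
    """Process performance index: distance from the process centre to
    the nearer specification limit, in units of 3 overall standard
    deviations of the sample."""
    m, s = mean(values), stdev(values)
    return min(usl - m, m - lsl) / (3 * s)

# Hypothetical length readings (mm) against the 140-145 mm spec
lengths = [141.8, 143.1, 142.2, 142.9, 142.5, 141.9, 143.2, 142.6,
           142.3, 142.8, 142.4, 142.0, 143.0, 142.7, 142.1]
index = ppk(lengths, lsl=140.0, usl=145.0)
```

Because the invented readings are tightly centred on 142.5 mm, the index here comfortably exceeds 1.33; widen the spread or shift the centre and the index drops accordingly.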

For the process capability index to give a meaningful estimate of the population reject rate, there are three conditions which must be fulfilled:

  1. The data should be variable data (refer to my blog dated 5 Oct 2017).
  2. The data must be normally distributed.
  3. The data must come from a stable process which is free from special causes (refer to my blog dated 8 Sep 2017).

In most cases it is impossible to measure every single production part to get an accurate reject rate; therefore, we use the ratio of the specification tolerance to the actual process spread, Ppk, to estimate the total production reject rate. Many organisations set a process capability goal of Ppk 1.33 for product output parameters; however, a lot of them do not know the actual meaning of Ppk 1.33, much less fulfil the three conditions above to give a meaningful estimate of the reject rate.


Please note that this article does NOT cover the technicalities or calculation of process capability. There are many sources which furnish that information. The intent of this article is to reiterate how a process capability number is translated into a practical conclusion which management understands.

Is the six sigma approach a SCAM?

In this article I will open a Pandora's box on why six sigma has failed in recent times after its proven success in the last century.

In the late 80s, Motorola pioneered an iconic problem solving technique named six sigma, a quality improvement methodology based on variation reduction, with the goal of achieving 99.99966% acceptance. This means the process produces parts within a 12-sigma range (plus or minus 6 sigma) of a normal distribution; after allowing for the conventional 1.5-sigma long-term shift, the reject rate is about 3.4 dppm. It was indeed an epic achievement given the limitations of design and manufacturing technology at that time. With such a low reject rate, Motorola boasted cost savings, which meant more profit. Fast forward to this millennium: the once glorious company had to be taken apart after losing billions of USD over a few consecutive years.
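The 3.4 dppm figure can be reproduced from the normal distribution. By the standard six sigma convention, the long-term process mean is assumed to drift by up to 1.5 sigma, leaving 4.5 sigma to the nearer specification limit; a quick check in plain Python:

```python
from statistics import NormalDist

# Six sigma with the conventional 1.5-sigma long-term shift: the
# nearer spec limit sits 6 - 1.5 = 4.5 sigma from the shifted mean.
tail = NormalDist().cdf(-4.5)   # one-sided tail probability
dppm = tail * 1e6               # defects per million parts, about 3.4
```

Without the shift allowance, a true plus-or-minus 6 sigma process would reject only about 2 parts per billion, which shows how much of the famous figure rests on the shift convention.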
There is also news of several companies which failed even after deploying six sigma:

  1. GE - Under the leadership of Jack Welch, the company adopted six sigma as its business management strategy; unfortunately, almost 60% of GE's six sigma initiatives failed to attain the desired goals.
  2. Ford - Suffered losses even though they had deployed six sigma and design for six sigma (DFSS).
  3. Home Depot - Former CEO Robert Nardelli was ousted due to his obsession with using the six sigma methodology to solve every problem. This caused misery for the workers, which directly impacted consumers since Home Depot is a retail business.
  4. 3M - When a former GE executive became CEO of the company, he instilled the six sigma methodology in all areas, including design. The design team claimed that six sigma hindered creativity and innovation.

With all the above mega corporations' failures, could it be that six sigma actually does not work, or are there other reasons? I have managed to think of a few reasons for the failure of six sigma initiatives:

  1. Top management relies too heavily on six sigma as a business management tool. Six sigma does not yield good results in a business environment because there are too many uncontrollable factors which are not possible to address. There are instances where we have identified the causes or factors which could impact an output; however, those factors are uncontrollable, which means it is near impossible to create an action plan. In business situations, uncontrollable factors are more prevalent than controllable factors. Therefore six sigma can never be effective in producing the desired results when there are more uncontrollable factors* than controllable factors.
  2. A one-shoe-fits-all attitude: six sigma may be the only problem solving approach a company executive knows, without in-depth knowledge of what six sigma is all about. They expect all employees to spend time force-fitting six sigma techniques to all problems. This causes misery for employees.
  3. Lack of a competent six sigma champion or top management to lead the quality improvement initiatives. I have seen organisations hire incompetent six sigma directors/champions who do not know what they are doing or do not have good knowledge of statistics. Statistics is the soul of six sigma. Most of these organisations do not have the expertise to validate the competency of the six sigma champion they are hiring. In turn, this type of six sigma champion ends up unable to guide employees to use the correct tools to effectively resolve problems. There are instances where some champions insist on using every tool taught in six sigma in a single variation reduction project!
  4. Inaccurate data which leads to the wrong solution. A lot of people have forgotten that the fundamental of six sigma is collecting accurate data. Undisciplined workers do not go to all lengths to collect and validate the accuracy of the sample data, which is needed to enable an accurate prediction of population behaviour.
  5. Poor support from executive management, which expects the six sigma champion to push everyone to participate in the six sigma initiative.
  6. The design people claim to be restricted by six sigma, where they have to play by the rules to ensure manufacturing is able to produce six sigma quality. In actual fact, design is one of the major culprits of quality issues which leave the manufacturing process unable to produce consistently good products.

The six sigma approach is still one of the most powerful tools for achieving consistent quality by reducing variation in the process inputs which could impact product quality. It must be applied correctly: with accurate measurement of the quality metrics during data collection, and an understanding of the influence of controllable and uncontrollable factors through root cause analysis, before assigning any action plan for improvement. If six sigma is applied correctly, it will not only reduce product quality variation, it can also predict future population quality.

Side note*

One of the most effective ways to manage uncontrollable factors is through unconventional knowledge, such as using Chinese metaphysics to predict business outcomes. Since I am also a practitioner of Chinese metaphysics, I have clients who request my services to predict business outcomes using tools in Chinese metaphysics. Feb 4 signifies the arrival of spring in the Chinese solar calendar and a new beginning. I would like to wish all my readers a great year ahead in 2018, and thank you for the support of my website.

Sharing is caring

Continuous Improvement Program CIP - 6sigma Methodology