Happy’s Essential Skills: The Need for Total Quality Control (Six Sigma and Statistical Tools), Part 2
To read part one of this series, click here.
Six Sigma
Six Sigma is a disciplined, data-driven approach and methodology for eliminating defects (driving toward six standard deviations between the mean and the nearest specification limit) in any process—from manufacturing to transactional, and from product to service.
The statistical representation of Six Sigma describes quantitatively how a process is performing. To achieve Six Sigma, a process must not produce more than 3.4 defects per million opportunities. A Six Sigma defect is defined as anything outside of customer specifications. A Six Sigma opportunity is then the total quantity of chances for a defect.
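To make the arithmetic concrete, here is a minimal sketch in Python (my own illustration, not part of the Six Sigma literature) that converts a hypothetical defect count into DPMO and an approximate sigma level, assuming the conventional 1.5-sigma shift:

```python
# Minimal sketch of the DPMO arithmetic described above.
# The defect counts are hypothetical; the 1.5-sigma shift is the usual convention.
from scipy.stats import norm

defects = 17          # defects observed (hypothetical)
units = 500           # units inspected (hypothetical)
opportunities = 10    # defect opportunities per unit (hypothetical)

dpmo = defects / (units * opportunities) * 1_000_000

# Convert the long-term defect rate to a short-term sigma level with the 1.5-sigma shift
sigma_level = norm.ppf(1 - dpmo / 1_000_000) + 1.5

print(f"DPMO = {dpmo:.0f}, sigma level ~ {sigma_level:.2f}")
# Plugging in 3.4 DPMO returns a sigma level of about 6.0
```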
The fundamental objective of the Six Sigma methodology is the implementation of a measurement-based strategy that focuses on process improvement and variation reduction through Six Sigma improvement projects. This is accomplished through Business Process Management Systems (BPMS) and the two Six Sigma improvement sub-methodologies, DMAIC and DMADV (Figure 1).
The Six Sigma DMAIC process (define, measure, analyze, improve, control) is an improvement system for existing business processes that fall below specification and need incremental improvement. The Six Sigma DMADV process (define, measure, analyze, design, verify) is used to develop new processes or products, or to define customer needs, at Six Sigma quality levels; it is also called Design for Six Sigma (DFSS). It can also be employed when a current process requires more than incremental improvement. Both Six Sigma processes are executed by Six Sigma Green Belts and Six Sigma Black Belts, and are overseen by Six Sigma Master Black Belts.
According to the Six Sigma Academy, black belts save companies approximately $230,000 per project and can complete four to six projects per year. General Electric, one of the most successful companies implementing Six Sigma, has estimated benefits on the order of $10 billion during the first five years of implementation. GE first began Six Sigma in 1995 after Motorola and Allied Signal blazed the Six Sigma trail. Since then, thousands of companies around the world have discovered the far-reaching benefits of Six Sigma.
Many frameworks exist for implementing the Six Sigma methodology. Six Sigma consultants all over the world have developed proprietary methodologies for implementing Six Sigma quality, based on similar change-management philosophies and applications of tools. A partial list of 18 frameworks and methodologies follows. Eight of them (marked with an asterisk) will be detailed in this and future columns in this magazine. Additionally, I have included 37 tools and templates in this column, and seven more will be detailed in the future. Definitions and examples of all are available at the Six Sigma website[1].
Frameworks and Methodologies
- Balanced Scorecard
- Benchmarking*
- Business Process Management (BPM)*
- Design for Six Sigma (DFSS)*
- DMAIC
- Harada Method
- Hoshin Kanri
- Innovation
- Kaizen
- Lean*
- Metrics*
- Plan, Do, Check, Act*
- Project Management*
- Robust Design/Taguchi Method
- Theory of Constraints
- Total Quality Management (TQM)*
- VOC/Customer Focus
- Work-out
Figure 1: Six-Sigma process improvement through the DMAIC and DMADV methods (also called an affinity diagram): define, measure, analyze, improve/design, control/verify.
Six Sigma Tools & Templates[1]
- 5 Whys
- 5S
- Affinity Diagram/KJ Analysis*
- Analysis of Variance (ANOVA)
- Analytic Hierarchy Process (AHP)
- Brainstorming*
- Calculators
- Capability Indices/Process Capability
- Cause & Effect (fishbone)
- Control Charts
- Design of Experiments (DOE)*
- FMEA*
- Graphical Analysis Charts
- Hypothesis Testing
- Kanban
- Kano Analysis
- Measurement Systems Analysis (MSA)/Gage R&R
- Normality
- Pareto
- Poka Yoke
- Process Mapping
- Project Charter
- Pugh Matrix
- QFD/House of Quality*
- RACI Diagram
- Regression*
- Risk Management
- SIPOC/COPIS
- Sampling/Data
- Simulation
- Software
- Statistical Analysis*
- Surveys
- Templates
- Value Stream Mapping
- Variation
- Wizards
Statistical Methods
The Need for Statistical Tools
The discussion of quality and customer satisfaction shows how important yields are to printed circuit boards. Any loss goes straight to the bottom line. So what are some of the tools that help improve process yields? Process control comes to mind. Chemical processes have always been difficult to control in printed circuit fabrication, and uncontrolled factors can always creep into our processes.
All process control is a feedback loop of some sort, but the element I want to focus on is the control block, or more precisely, the human decisions that make up process control.
Process Control
The first link in process control is the human link. The high-level objectives are to:
- Reduce variations
- Increase first pass yields
- Reduce repair and rework
- Improve quality and reliability
- Improve workmanship
The process control tools and methods that a person may have to work with have been listed already. Of particular importance for the engineer are the statistical tools, as seen in Figure 2. Traditionally, statistical tools have been rather cumbersome and not easy to learn. I have good news: You can now get good statistics training from the web, at everyone’s favorite price—free.
Even if your company has good statistical software available, like mine did with Minitab, it is only available as long as you work there. By downloading the NIST/SEMATECH e-Handbook of Statistical Methods[2] and the software Dataplot, you have an equally good tool at home that can travel with you wherever you may work. Your next job may not have any statistical tools!
Figure 2: Four main techniques are basic to selecting the right statistical tool for problem analysis.
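To give a feel for what free, scriptable tooling can do, here is a minimal sketch in Python (my own illustration; the measurements and specification limits are hypothetical) of the process capability indices Cp and Cpk listed among the tools above:

```python
# Minimal sketch of a process-capability calculation (Cp, Cpk).
# The measurements and specification limits below are hypothetical examples.
import numpy as np

measurements = np.array([9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 9.7, 10.1, 10.0, 9.9])
lsl, usl = 9.5, 10.5   # lower/upper specification limits (hypothetical)

mean = measurements.mean()
sigma = measurements.std(ddof=1)   # sample standard deviation

cp = (usl - lsl) / (6 * sigma)                         # potential capability
cpk = min(usl - mean, mean - lsl) / (3 * sigma)        # capability allowing for centering

print(f"mean = {mean:.3f}, sigma = {sigma:.3f}, Cp = {cp:.2f}, Cpk = {cpk:.2f}")
```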
SEMATECH/NIST e-Handbook of Statistical Methods
I was looking for Weibull reliability plot information and the SEMATECH/NIST e-Handbook of Statistical Methods (also called the Engineering Statistics e-Handbook) from the National Institute of Standards & Technology (NIST) popped up. After playing with it a while, I discovered it was designed for just us process engineers. The organization of the handbook follows the statistical tools that a process engineer would need to ferret out a problem, measure its extent, look for root causes, uncover how many factors are involved, postulate a solution, verify the solution, and then monitor the process to be sure the problem is gone and does not reappear. These stages map onto the eight main chapters of the handbook (Figure 3).
Figure 3: The NIST/SEMATECH e-Handbook of Statistical Methods is available on the Internet.
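Since the Weibull reliability plot was what sent me to the handbook in the first place, here is a minimal sketch in Python (my own illustration, not one of the handbook's Dataplot examples) of fitting a two-parameter Weibull distribution to hypothetical failure-time data:

```python
# Minimal sketch: fit a two-parameter Weibull distribution to failure-time data.
# The failure times below are hypothetical; the handbook's own examples use Dataplot.
import numpy as np
from scipy.stats import weibull_min

failure_hours = np.array([312, 485, 620, 710, 845, 920, 1010, 1150, 1300, 1510])

# Fix the location parameter at zero so only shape (beta) and scale (eta) are fitted
shape, loc, scale = weibull_min.fit(failure_hours, floc=0)

print(f"Weibull shape (beta) = {shape:.2f}, scale (eta) = {scale:.0f} hours")
# beta > 1 suggests wear-out failures; beta < 1 suggests infant mortality
```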
The usefulness of this handbook comes from the experimental data sets (case studies) supplied with it. As you read, you are encouraged to make your own comparisons with its statistical tools. It does so by supplying a complete statistical software program, Dataplot, for you to use. When used with the supplied data sets, it coaches you through the interpretation of the results; you can then substitute your own data and examine the results. This “First we run the demo and then we run your problem” system is a very effective way to coach a person through the statistical tools.
NIST has prepared versions of the Dataplot software[3] for nearly all operating systems: Windows, NT, UNIX, Mac OS, etc. It is a bit large, but the download time is worth it.
Handbook Integrated with the Software
The majority of the sample output, graphics, and case studies were generated with Dataplot. This aspect does not require you to have Dataplot or know anything about it. Most of the sample output and graphics could have been created with any general-purpose statistical program.
The case studies contain a “Work this Example Yourself” section that is implemented using Dataplot. That is, you click on a link that starts Dataplot and executes a pre-existing Dataplot macro. Dataplot runs in a separate window, so you can see the handbook pages and the Dataplot output together. Once Dataplot is running, you can also enter your own commands in addition to running the handbook-generated macros. This aspect of the integration requires you to have Dataplot installed on your local system.
Dataplot can access the handbook as an on-line help system. This complements the normal Dataplot on-line help: the handbook provides descriptions of the statistical techniques, while the regular on-line help focuses on how each technique is implemented in Dataplot.
Correlation Plots and Curve Fitting
One of the Six Sigma (TQC) tools is correlation plots and curve fitting/regression. Curve fitting is the process of constructing a curve, or mathematical function, that best fits a series of data points, possibly subject to constraints. Curve fitting can involve either interpolation, where an exact fit to the data is required, or smoothing, in which a “smooth” function is constructed that approximately fits the data.
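To make the distinction concrete, here is a minimal sketch in Python (my own illustration, with made-up data points) that fits the same six points both ways, once by exact interpolation and once by least-squares smoothing:

```python
# Sketch: the same XY data fitted by exact interpolation and by least-squares smoothing.
# The data points are hypothetical.
import numpy as np
from scipy.interpolate import CubicSpline

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.1, 2.0, 4.2, 7.9, 16.5, 31.0])

# Interpolation: the cubic spline passes exactly through every data point
spline = CubicSpline(x, y)

# Smoothing: a second-order polynomial fit by least squares only approximates the points
coeffs = np.polyfit(x, y, deg=2)
poly = np.poly1d(coeffs)

print("spline at x=2.5:", float(spline(2.5)))
print("quadratic fit at x=2.5:", poly(2.5))
```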
There is a very good shareware program for this called CurveExpert 1.4[4], authored by Daniel Hyams; the author asks serious users to register the software. It is a comprehensive curve fitting system for Windows. XY data can be modelled using a toolbox of three linear regression models (linear, quadratic, and polynomials up to the 16th order); nonlinear regression models (exponential, modified exponential, logarithmic, reciprocal logarithm, vapor pressure, power, modified power, shifted power, geometric, modified geometric, root, Hoerl, modified Hoerl, reciprocal, reciprocal quadratic, Bleasdale, Harris, exponential association, saturation growth, Gompertz, logistic, Richards, MMF, Weibull, sinusoidal, Gaussian, hyperbolic, heat capacity, and rational function); interpolation; or splines, with over 35 models built in.
You can also build custom regression models with the 15 additional models provided (Bacon, Watts, Bent Hyperbola, BET, Beta distribution, Cauchy, Chapman-Richards, Freundlich, Gamma, generalized hyperbola, Gunary, inverse, Langmuir, log normal, Lorentz equation, sum-of-exponentials, truncated Fourier series, and two-parameter bell) and compare the fit of various models, or let the software pick the best one for you. It can be downloaded from a number of websites featuring shareware.
CurveExpert screens are shown in Figure 4.
Figure 4: CurveExpert screens: a) XY data input with ranking of the top 27 models fitted to the XY data; b) graph of the XY data with the regression fit measures S and r; and c) the regression model, residuals, and variable coefficients.
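If you do not have CurveExpert handy, the same kind of nonlinear fit can be sketched with free tools. The example below is my own illustration with hypothetical data: it fits an exponential model and reports a standard error S and a correlation coefficient r in the spirit of Figure 4, although CurveExpert's exact formulas may differ.

```python
# Sketch: nonlinear regression of an exponential model y = a * exp(b * x),
# reporting a standard error (S) and correlation coefficient (r) similar in spirit
# to CurveExpert's fit summary. Data values are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def exponential(x, a, b):
    return a * np.exp(b * x)

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.1, 2.0, 4.2, 7.9, 16.5, 31.0])

params, _ = curve_fit(exponential, x, y, p0=(1.0, 1.0))
fitted = exponential(x, *params)
residuals = y - fitted

# Standard error of the estimate and correlation between data and fitted values
s = np.sqrt(np.sum(residuals**2) / (len(x) - len(params)))
r = np.corrcoef(y, fitted)[0, 1]

print(f"a = {params[0]:.3f}, b = {params[1]:.3f}, S = {s:.3f}, r = {r:.4f}")
```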
Summary
No matter what you do, engineering statistics will be the most useful tool in your bucket. I had the fortunate opportunity to learn it during my chemical engineering coursework, and it proved to be the one tool that helped me solve problems (and get promotions).
References
- iSixSigma, www.iSixSigma.com
- NIST/SEMATECH e-Handbook of Statistical Methods, National Institute of Standards and Technology
- Dataplot software, NIST
- CurveExpert 1.4, Daniel Hyams
Happy Holden has worked in printed circuit technology since 1970 with Hewlett-Packard, NanYa/Westwood, Merix, Foxconn, and Gentex. He is currently co-editor, with Clyde Coombs, of the Printed Circuit Handbook, 7th Edition. To contact Holden, click here.