# A B C D E F G H I K L M N O P Q R S T U V W X Y Z 


1.5 Sigma Shift

The notion that the Z score for process capability ought to have 1.5 arbitrarily added to it. This inflates the quality indices by 0.5, so a process with a Ppk of 1.1 is reported as having a Ppk of 1.6 for no apparent reason.


2^K Factorial Experiment
A particularly efficient version of General Full Factorial experiments. Each input variable is allowed in only one of two states. The total number of observations for a full factorial replicate is 2^K where K is the number of input variables.
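The run list for a 2^K design can be sketched with the standard library; the factor names here are hypothetical, and the usual -1/+1 coding for the low and high states is assumed.

```python
from itertools import product

def full_factorial_2k(factors):
    """Generate all 2^K runs for the given factor names.

    Each input variable takes only one of two states,
    coded -1 (low) or +1 (high).
    """
    return [dict(zip(factors, levels))
            for levels in product([-1, 1], repeat=len(factors))]

runs = full_factorial_2k(["temperature", "pressure", "time"])
print(len(runs))  # 2^3 = 8 observations per full factorial replicate
```

With K = 3 input variables, one replicate requires 2^3 = 8 observations, matching the definition above.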


5S
TPS tool for uncluttering the workplace and making sure that everything that is needed is immediately available.


Action Plan

QuikSigma software splits the traditional FMEA form. The Action Plan is the right-hand portion of the traditional FMEA. It lists the improvement actions that will be taken.


Alpha Risk
The risk of saying that some variable has an effect, when actually only normal random variation is at work.




ANOM
Analysis of Means. An excellent Process Behavior Chart based method for accomplishing an end similar to that of ANOVA.


ANOVA
Analysis of Variance. A method for testing whether one or more subgroup means differ enough from the other subgroup means to be counted as statistically significant.


Badly Structured Data

Data where input variables are confounded, levels of the input variables are not appropriate, etc. In short, the type of data we often encounter in real life.


Bartlett’s Test
Similar to the F test (ratio of two variances), but capable of testing whether multiple variances are different from each other.


Baseline
The current state of the Key Metric.


Beta Risk
The probability of missing an event if it happens.


Bias
The difference between the mean of many measurements taken by the system and the true value. If there is no bias, the system is accurate.


Blocking
In 2^K Factorial Experiments, a method for getting an extra variable into the experiment for almost no cost. Also a strategy for subgrouping input variables so as to reduce the effect of irrelevant variables.


Box and Whisker Plot
A plot associated with data having an Xc input variable. Half the data falls within the box, the bar across the middle of the box is the data median, and the whiskers reach to the last data point within 3 standard deviations of the median. Standard deviation is estimated from the semi-interquartile range in this case. Data past the whiskers are represented with asterisks.


Business Case
A two or three sentence statement of why it is important to do a project.


C&E Matrix

Cause and Effect Matrix.


Capability Study
Compares the Voice of the Process (histogram) with specification limits to test whether items produced are likely to be satisfactory.


Categorical Data
Information that can be put in categories, such as “round” or “square”. This is the weakest type of data.


Cause and Effect Matrix
Tool that normally follows the Process Map. Prioritizes input variables, and carries the important ones forward to the FMEA.


Center Point
Point placed at the midpoint of all the input variables in a 2^K design. Used to test whether curvature is present.


Central Limit Theorem
An important mathematical theorem that is the basis for Confidence Intervals and much more.


Characterization Experiments
Sometimes called confirmatory experiments. They use the output of a screening experiment to select variables, then refine and confirm the model.


Chi Square Test
Tests whether counts of data arranged in a two dimensional matrix could have come about by random chance. Tests for row-column independence.
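The statistic behind this test can be sketched with the standard library. The counts below are hypothetical; under row-column independence, each expected count is the row total times the column total divided by the grand total.

```python
def chi_square_statistic(table):
    """Chi-square statistic for a two-dimensional table of counts.

    Expected counts assume row-column independence:
    expected = row_total * column_total / grand_total.
    """
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical counts: defects by shift (rows) and line (columns)
stat = chi_square_statistic([[20, 30], [30, 20]])
print(round(stat, 2))  # prints 4.0
```

With (rows-1)(cols-1) = 1 degree of freedom, a statistic of 4.0 exceeds the conventional 3.84 critical value, so the rows and columns would be judged not independent.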


Colinearity
When interval/ratio input variables are really the same variable under different names, colinearity exists. It makes the model less reliable.


Confounded
When two or more input variables are inextricably mixed, they are said to be confounded with each other.


Continuous Data
Subset of interval and ratio data. Can take on an infinite number of values. An example is the true air pressure in a tire or the temperature of an oven.


Control Chart
What most people call a Process Behavior Chart.


Control Plan
Essential for all process improvement projects. It lists the variables that require formal control, the improvements made in those variables, how a deviation from ideal will be detected, and how the deviation will be responded to.


Cp
One of the four quality indices. Indicates what the process capability would be if special cause did not exist and the process mean were centered between the specifications.


Cpk
One of the four quality indices. Indicates what the process capability would be if no special cause were present.


Critical to Quality
Those things that the customer desires or requires.


CTQ
Critical to Quality.



Defect
A characteristic of an item that deviates from the desired state.


Defective
An item that is not acceptable.


Defects per Opportunity
The number of defects found divided by the number of opportunities that exist for defects.


Defects per Unit
Number of defects found divided by the number of units searched.
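The per-unit and per-opportunity arithmetic in these entries reduces to a few divisions; the counts below are hypothetical.

```python
# Hypothetical inspection results
units = 500            # units searched
opportunities = 20     # defect opportunities per unit
defects = 35           # defects found

dpu = defects / units                      # defects per unit
dpo = defects / (units * opportunities)    # defects per opportunity
dpmo = dpo * 1_000_000                     # defects per million opportunities
print(dpu, dpo, dpmo)
```

Here DPU is 0.07, DPO is 0.0035, and scaling DPO by one million gives 3500 DPMO.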


Degrees of Freedom
Essentially the number of equations you can write without overspecifying the system. Usually the number of data minus one.


Descriptive Statistics
A shorthand way of describing a data set by making statements about where the center is, how dispersed the data are, and/or how the data are distributed.
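The center, dispersion, and most frequent value of a data set can be summarized directly with Python's standard library; the measurements below are hypothetical.

```python
import statistics

data = [4.1, 4.3, 3.9, 4.3, 4.0, 4.2, 4.3]

center = statistics.mean(data)    # where the center is
middle = statistics.median(data)  # the middle value (50th percentile)
common = statistics.mode(data)    # the value that occurs most often
spread = statistics.stdev(data)   # how dispersed the data are
```

These four numbers are a compact description of the whole data set, which is exactly the point of descriptive statistics.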


Design of Experiments
An investigation into the behavior of a system, using wisely structured data.


DFX
An extension of Design for Manufacturing (DFMA), which includes Design for the Environment, Design for Service, etc.


Discrete Data
Subset of interval and ratio data. Occurs in discrete steps. Example is the value of the coins in your pocket. All interval/ratio data is discrete once it is written down or entered into a computer.


DOE
Design of Experiments.


DPMO
Defects per million opportunities.


DPO
Defects per opportunity.


DPPM
Defective Parts per Million.


DPU
Defects per Unit.


Economical Order of Corrective Actions

Challenge the specifications, eliminate special cause, center the process mean between the specifications, reduce normal random variation.


Entitlement
The best state of the Key Metric that has been seen, or that can reasonably be imagined.


Estimated Data
Data derived from statistical estimates. May have some error, but may also be more obtainable than perfect data.


EVOP
Evolutionary Operations. Many small designed experiments used to characterize a process without shutting it down or creating excessive scrap.


F Test

Tests ratios of variances. If the ratio is far enough from one, the variances are judged to be unequal.


Failure Modes and Effects Analysis
A tool for studying the severity, frequency of occurrence, and the detectability of failures. It leads the user to prioritize potential action items and to focus on those that are most likely to have great impact.


Ronald Fisher

British statistician who developed ANOVA.


Fit
What the model predicts.


FMEA
Failure Modes and Effects Analysis.


Four Rules
The four rules upon which TPS is based: (1) all work shall be highly specified as to content, timing, and outcome; (2) every customer-supplier connection must be direct and unambiguous; (3) the pathway for every product and service must be simple and direct; and (4) improvements must be made in accordance with the scientific method, under the guidance of a teacher, at the lowest possible level in the organization.


Fractional Factorial Experiment
An experiment where some fraction of all combinations of input variables occur.


Full Factorial Experiment
An experiment where all combinations of input variables occur.


Gage Repeatability and Reproducibility

A tool for testing the precision of a measurement system.


Gaussian Distribution
The Normal Distribution or “bell curve”.


Goal
The state of the Key Metric that the project is designed to produce.


William Gossett
British statistician who developed the T Test, and published it under the pseudonym “Student”.


Hagen’s Caveat

When a defect is found in an item, those responsible for testing seldom continue to test the item. Hence, defects that are located farther down the checklist will be under-represented in a report on the frequency of defects.


Half Fractional Experiment
A Designed Experiment where only half the possible combinations of input variables occur.


Highly Correlated
One or more input variables closely predict an output variable.


Intraclass Correlation Coefficient

X bar and R based method for evaluating measurement system precision.


I-MR chart
A Process Behavior Chart which plots individual values on the upper chart, and moving ranges on the lower chart. Also known as X-MR.
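The limits for an I-MR chart can be sketched from the individual values and their moving ranges; the measurements below are hypothetical, and the conventional chart constants (2.66 for the individuals chart, 3.267 for the moving range chart) are assumed.

```python
# Hypothetical individual measurements, in time order
xs = [10.2, 10.5, 9.8, 10.1, 10.4, 9.9, 10.3]

mrs = [abs(b - a) for a, b in zip(xs, xs[1:])]  # moving ranges
x_bar = sum(xs) / len(xs)                       # center of the upper chart
mr_bar = sum(mrs) / len(mrs)                    # center of the lower chart

# Conventional I-MR constants
x_ucl = x_bar + 2.66 * mr_bar   # upper limit, individuals chart
x_lcl = x_bar - 2.66 * mr_bar   # lower limit, individuals chart
mr_ucl = 3.267 * mr_bar         # upper limit, moving range chart
```

Points outside these limits would be treated as signals of extraordinary (special cause) variation rather than normal random variation.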


Inference Space
The region that has been investigated and characterized in a statistical model.


Inferential Statistics
Statistics for decision making.


Interaction
A value of a Y variable that is substantially different from what would be expected from a linear combination of the input variables.


Interval Data
Numerical data where there are equal intervals between integers on the scale. The zero point is arbitrary. Examples are temperature in Celsius, the year, A.D. Can be added and subtracted, but not multiplied and divided.


Ishikawa Fishbone Diagram
A Lean-TPS tool for finding the KPIVs for a single KPOV.



Kaikaku
Sudden, large improvement.


Kaizen
Constant, gradual improvement.


Kanban
Demand signal.


Kappa Test
A test for degree of agreement, used to see if items can reliably be placed in the same categories.


Key Process Input Variables
The input variables that influence the outcome of a process.


KPIV
Key Process Input Variable. The knobs that can be adjusted to change the output of the process.


KPOV
Key Process Output Variable. The meters you would watch to determine how well or poorly a process step has gone.



Lean
The Americanized version of TPS.


Levene’s Test
Similar in purpose to Bartlett’s Test, but not so sensitive to nonnormal distribution of the data. Tests whether subgroups of data have similar variances.


Likert Scale
A form of survey question that asks people whether they strongly agree, agree, feel neutral, disagree, or strongly disagree with a statement.


Long Term Data
Data taken over a long enough period that the input variables have an opportunity to express themselves.


Main Effect

In an experiment, the effects due to just the input variables.


Mean Square
In ANOVA, the sum of the squared deviations divided by degrees of freedom. This is a variance.


Mean
A measure of where the center of a data set is. The sum of all the data, divided by the number of data.


Median
A measure of where the middle of a data set is. The middle value, or 50th percentile, of a set of data.


Mode
The value that occurs most often in a set of data.


MR
Moving range.


MR-Bar
Shorthand for the average moving range.




Multiple Linear Regression
Builds a model for interval/ratio Y given a set of data with more than one interval/ratio input variable.


Negatively Correlated

When the input variable increases, the output variable decreases.


Noise
Random variation.


Nominal Data
Same as categorical data. Data with a name instead of a number.


Normal Distribution
The Gaussian Distribution or “bell curve”.


Observational Study

An investigation into the behavior of a system as it normally operates. This normally requires that the data are not structured as they would be in a Designed Experiment.


Operational Definition
Tells where the numbers used for the Key Metric come from, and how calculations are made to produce the Key Metric.


Optimization Experiments
Experiments aimed at finding the optimum operating point for a process.


Ordinal Data
Categorical data with an order that makes sense. Small, medium and large, for example.


P Chart

A type of Process Behavior Chart showing a proportion, such as the proportion of items that are defective.


P Value
The probability of getting a result at least as far from the mean as the point being studied, assuming that only normal random variation is at work.
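Under a Normal model, this two-sided probability can be sketched with the standard library's complementary error function; the Z score of 1.96 below is chosen only because it is a familiar benchmark.

```python
import math

def two_sided_p_from_z(z):
    """Two-sided P value for a Z score under the Normal model:
    the probability of landing at least |z| standard deviations
    from the mean, in either direction."""
    return math.erfc(abs(z) / math.sqrt(2))

p = two_sided_p_from_z(1.96)
print(round(p, 3))  # prints 0.05
```

A P value near 0.05 at Z = 1.96 matches the conventional significance threshold.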


Pareto Chart
A histogram, sorted in descending order of frequency. Often used to discover where, when, and by whom defects are created.
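The sorting behind a Pareto Chart is easy to sketch with the standard library; the defect log below is hypothetical.

```python
from collections import Counter

# Hypothetical defect log
defect_types = ["scratch", "dent", "scratch", "misalign", "scratch",
                "dent", "scratch", "void", "dent", "scratch"]

counts = Counter(defect_types)
pareto = counts.most_common()   # categories in descending order of frequency

# Cumulative percentage line, as drawn on a Pareto Chart
total = sum(counts.values())
cumulative = []
running = 0
for name, count in pareto:
    running += count
    cumulative.append((name, count, 100 * running / total))
```

The first entries of `cumulative` show how quickly a few categories account for most of the defects, which is the Pareto Principle at work.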


Pareto Principle
A minority of inputs cause the majority of change in the outputs. 80/20 Rule.


Perceptual Map
A visual method for seeing how various alternatives compare, based on two or three input variables. The input variables become coordinate axes, and the alternatives are placed in the space according to their characteristics.


Perfect Data
True, complete data with no error.


Poka Yoke
“To make mistakes impossible”.


Positively Correlated
When the input variable increases, the output variable increases as well.


Power
The probability of detecting an effect if it happens. 1 minus beta.


Pp
One of the four quality indices. Indicates what the process capability would be with the process mean centered between the specifications.


Ppk
One of the four quality indices. Indicates the actual performance of the process.
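The arithmetic behind the overall-performance indices can be sketched from the usual formulas, Pp = (USL - LSL) / 6σ and Ppk = min(USL - mean, mean - LSL) / 3σ; the specification limits and measurements below are hypothetical.

```python
import statistics

# Hypothetical specification limits and long-term measurements
lsl, usl = 9.0, 11.0
data = [10.2, 10.5, 9.8, 10.1, 10.4, 9.9, 10.3, 10.0]

mean = statistics.mean(data)
sigma = statistics.stdev(data)  # overall (long-term) standard deviation

pp = (usl - lsl) / (6 * sigma)                   # assumes a centered mean
ppk = min(usl - mean, mean - lsl) / (3 * sigma)  # actual performance
```

Because Ppk uses the nearer specification limit, it can never exceed Pp; the gap between the two shows how much capability is lost to off-center operation.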


Precision to Tolerance Ratio
A measure of whether a measurement system can discriminate between acceptable and unacceptable items.


Precision
The smaller the standard deviation of the random error in a measurement system, the better the precision.


Process Behavior Chart
A tool developed by Walter Shewhart to separate normal random variation from extraordinary variation. He used the tool to find which processes had changed and therefore probably needed his attention.


Process Map
Finds CTQs and Key Process Input Variables, and delivers them to the Cause and Effect Matrix for prioritization.


Product Function Map
Replacement for the Process Map when doing DMAIC Design for Six Sigma. Instead of process steps, it uses product functions.


Project Charter
The root document that defines what will be done in a project, what the expected benefit is, and which states why the project is important to the organization.


Proof of Performance
The simplest experiment. Simply shows that changes to the process have brought about real, beneficial change. Usually compares the process before and after improvement.


Pugh Concept Selection
A simple matrix for evaluating which design concept will best serve.



The QuikSigma software approach to doing Designed Experiments. It’s much easier than any other approach we have seen.


QuikScreen
QuikSigma software’s implementation of Plackett-Burman experiments as taught by Dr. Donald J Wheeler. These experiments are extremely cost effective.


QuikSigma® Software
Software designed specifically to organize and automate the process of doing a process improvement project.


Randomly Structured Data

Data in which all combinations of input variables are equally likely to occur.


Range
A measure of dispersion of data. The largest number in a data set minus the smallest.


Ratio Data
Interval data with a meaningful zero point. Can be added, subtracted, multiplied and divided. Examples are the amount of money in your pocket or the weight of a sample.


Rational Subgroup
A subgroup of items made under circumstances that are as nearly identical as possible.


Regression
Inferring the equation that connects one or more interval/ratio input variables with an interval/ratio output variable. The input to regression is a set of data showing several to many examples of the combinations of input and output variables.


Repeatability
In Measurement System Analysis, the “test-retest error”, or the inherent precision of the fundamental measurement tool itself.


Reproducibility
In Measurement System Analysis, the ability of the system to get a precise answer when different operators make the measurements.


Residual
The error in a model. It is the difference between what the model predicts (fit) and what actually happened.


RMS
Root mean square.


Robust
A process or system that is not much influenced by changes in the input variables is robust to those variables.


Rolled Throughput Yield
An appropriate measure of how much “hidden factory” an operation has. It is the first pass yield of each of the steps of the process, multiplied together. This is also the probability that a single item will pass through the entire process without rework.
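The definition above reduces to a single product; the first pass yields below are hypothetical.

```python
import math

# Hypothetical first pass yields for each step of a four-step process
first_pass_yields = [0.98, 0.95, 0.99, 0.97]

# Rolled Throughput Yield: the first pass yields multiplied together.
# Also the probability that one item passes every step without rework.
rty = math.prod(first_pass_yields)
print(round(rty, 3))  # prints 0.894
```

Even though every individual step yields 95% or better, only about 89% of items make it through the whole process untouched; the gap is the “hidden factory”.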


RTY
Rolled Throughput Yield.


Screening Experiments

Quick and rough experiments designed to find which potential input variables are active, and to make a rough model. QuikScreen is a good example.


Semantic Differential
A form of survey question that allows people to express an opinion on a linear scale.


Short Term Data
Data taken over a short enough period that one or more major input variables do not have an opportunity to express themselves.


Signal
Real process change.


SIPOC
Acronym for Supplier, Input, Process, Output, and Customer (or Client). The Process Map is usually preferred over SIPOC.


Six Sigma
A customer centered, systematic, data driven way of doing things better. Started in the 1980s at Motorola and later popularized by General Electric. A six sigma process will produce 3.4 defects per million opportunities.
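Assuming the Normal model, the 3.4 figure comes from scoring a “six sigma” process at Z = 6 - 1.5 = 4.5 once the 1.5 Sigma Shift is subtracted; a standard-library sketch:

```python
import math

def dpmo_from_z(z):
    """Defects per million opportunities for a one-sided Z score
    under the Normal model."""
    tail = math.erfc(z / math.sqrt(2)) / 2  # P(Z > z)
    return tail * 1_000_000

# A "six sigma" process is scored at Z = 6 - 1.5 (the 1.5 sigma shift)
print(round(dpmo_from_z(4.5), 1))  # prints 3.4
```

Without the shift, a true Z of 6 would correspond to roughly 0.002 DPMO, which shows how much of the headline figure the arbitrary 1.5 accounts for.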


SMART
Every Project Charter should be tested against the SMART acronym. Each project should be specific, measurable, attainable, relevant to organization objectives, and time bound.


Spaghetti Diagram
TPS tool that tracks the path of an item through a process. If the resulting diagram looks like spaghetti, then corrective action is indicated.


Stable and Predictable
A process that exhibits only normal random variation. A process where the next point can be predicted within limits.


Standard Deviation
A measure of dispersion of data. The square root of variance.


Star Point
Points added to a 2^K design after a center point indicates that curvature is present. Converts the design to a Central Composite Design.


Statistical Tolerancing
The practice of statistically managing the variation of parts that need to fit together.


T Score

The number of standard deviations something is from the mean, when using the Student’s T Distribution as a model.


T Test
Actually three tests. The 1-Sample T test evaluates whether some chosen number is likely to be the true mean of a population, based on information from a sample. The 2-Sample T Test evaluates whether two samples were likely drawn from different populations. The Paired T Test is similar to the 2-Sample test, but is for data that occur in natural pairs.
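A minimal sketch of the 1-Sample case, using hypothetical measurements and a hypothesized mean of 10.0; a T table or statistical software would convert the resulting statistic to a P value.

```python
import math
import statistics

def one_sample_t(sample, mu0):
    """1-Sample T statistic: how far the sample mean falls from the
    hypothesized population mean mu0, in standard-error units."""
    n = len(sample)
    return ((statistics.mean(sample) - mu0)
            / (statistics.stdev(sample) / math.sqrt(n)))

t = one_sample_t([10.2, 10.5, 9.8, 10.1, 10.4, 9.9, 10.3, 10.0], 10.0)
```

A large |t| (judged against the Student’s T Distribution with n - 1 degrees of freedom) would make 10.0 an unlikely candidate for the true mean.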


Taguchi Loss Function
An idea developed by Gauss and popularized by Taguchi. Basically, it says that the farther you are from the ideal operating point, the more your process costs society.


Takt Time
The “drumbeat of the market”.


Tampering
Adjusting a process that is stable and predictable. Tampering always increases variation in the long run.


Theory of Constraints
Eliyahu Goldratt’s commercialization of the results of Linear Programming. Teaches that bottlenecks are more expensive than most people think, and that in order to optimize the whole system you will almost always have to run one or more of the subsystems below their individual optimum.


Thurstone’s Method
A method for deriving interval ratings for various alternatives, given the proportion of times each alternative was preferred to the others.


TOC
Theory of Constraints.


Tolerance Stackup
See Statistical Tolerancing.


Toyota Production System
An elegant system which Toyota uses to run their operations. Based on the Four Rules.


TPS
Toyota Production System.


U Chart

A type of Process Behavior Chart showing the number of occurrences of something per unit. Most commonly used to show defects per unit.


Unbalanced Data
Data with categorical inputs, and unequal numbers of data for some combination of input variables.


Unit
The constant-size field which is searched for defects.


Value
The intrinsic value that a product or feature provides.


Value Stream Map

A tool from Lean-TPS that lays out a process, and identifies the total time and the productive transformation time for each step. More thorough versions include such things as communication lines, machine loading, inspections and rework.


Variance
A measure of dispersion of data. The sum of the squared deviations per degree of freedom.
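The definition computes directly; the data below are hypothetical, and n - 1 degrees of freedom are used, as is usual for a sample.

```python
# Variance from its definition: sum of squared deviations
# per degree of freedom (n - 1 for a sample)
data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]

mean = sum(data) / len(data)
variance = sum((x - mean) ** 2 for x in data) / (len(data) - 1)
```

Taking the square root of this value gives the standard deviation, which ties this entry to the Standard Deviation entry above.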


VOC
Voice of the Customer.


Voice of the Customer
What the customer values or demands. Often expressed as specifications. Takes into account the notion that pleasing the customer often involves trading one desired attribute for another.


Voice of the Process
The natural behavior of a process. Often expressed through a Process Behavior Chart or a Capability Study.


VOP
Voice of the Process.


VSM
Value Stream Map.


Donald J Wheeler

Respected author and statistician, and today’s most prominent expert on Process Behavior Charts.


Wisely Structured Data
Data structured in such a way as to allow a maximum number of valid conclusions to be drawn. Examples are 2^K Factorial Experiments and Central Composite Designs.


X Input Variable

QuikSigma software’s designation for an interval/ratio input variable.


XBar and R Chart
Arguably the strongest of all the Process Behavior Charts. Charts the means and ranges of rational subgroups over time. Traditionally, subgroups of five are very common.
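The chart limits can be sketched from rational subgroup means and ranges; the subgroups below are hypothetical, and the standard constants for subgroups of five (A2 = 0.577, D4 = 2.114, D3 = 0) are assumed.

```python
# Hypothetical rational subgroups of five measurements each
subgroups = [
    [10.1, 10.3, 9.9, 10.2, 10.0],
    [10.4, 10.2, 10.1, 10.3, 10.5],
    [9.8, 10.0, 10.1, 9.9, 10.2],
]

x_bars = [sum(g) / len(g) for g in subgroups]    # subgroup means
ranges = [max(g) - min(g) for g in subgroups]    # subgroup ranges
grand_mean = sum(x_bars) / len(x_bars)
r_bar = sum(ranges) / len(ranges)

# Standard constants for subgroups of five: A2 = 0.577, D4 = 2.114, D3 = 0
x_ucl = grand_mean + 0.577 * r_bar   # XBar chart upper limit
x_lcl = grand_mean - 0.577 * r_bar   # XBar chart lower limit
r_ucl = 2.114 * r_bar                # R chart upper limit
```

Subgroup means or ranges falling outside these limits would be treated as evidence of special cause variation.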


XBar and S Chart
A Process Behavior Chart. Similar to Xbar and R, but uses the standard deviation of the subgroups rather than the range to set limits.


XBar
Shorthand for an estimate of a group or subgroup mean.


Xc Input Variable
QuikSigma® software’s designation for a categorical/nominal input variable.


Y Output Variable

QuikSigma® software’s designation for an interval/ratio output variable.


Z Score

The number of standard deviations something is from the mean, when using the Normal Distribution as a model.
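The definition is a one-line formula; the numbers below are hypothetical.

```python
def z_score(x, mean, stdev):
    """Number of standard deviations x lies from the mean,
    under the Normal Distribution as a model."""
    return (x - mean) / stdev

print(z_score(13.0, 10.0, 2.0))  # prints 1.5
```

A value of 13.0 sits 1.5 standard deviations above a mean of 10.0 when the standard deviation is 2.0.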

