The Technical History of Six Sigma

Dr. Mikel J. Harry


Setting the Stage
As a first point of business, let’s set the stage for our ensuing discussion with a brief orientation on the technical evolution of Six Sigma. In this way, we can get down into the boiler room of Six Sigma to better understand its source of power. Most often, the history of Six Sigma is presented from a business viewpoint. Such presentations are usually aimed at the leaders, managers, and executives of large-scale corporations. In this context, the typical history lesson on Six Sigma almost invariably involves a discussion of how it found its legs at Motorola during the 1980s and then spread across the company in the pursuit of world-class quality and business success. Well, there’s another side to the story. Today, very few know about the technical evolution of Six Sigma, and even fewer are aware of what actually underpinned its development. With this thought in mind, we’ll present some of the more notable historical artifacts that made Six Sigma a reality, rather than dwelling on its vision or how it initially began its life.

To this end, we’ll investigate some of the key engineering breakthroughs that silently supported getting Six Sigma to each progressive level of maturity. In this regard, we’ll also discuss the development, deployment, and implementation of Six Sigma, but from a “boots on the ground” perspective. In other words, we’re going to look back at Six Sigma through the eyes of the scientists and engineers who laid its foundation. In this way, we can better grasp and appreciate the true nature of the Six Sigma Model (SSM). At the same time, we’ll come to better understand how the SSM became a powerhouse for enabling Design Engineers to configure robust products, processes, and services.

Through this author’s recollections, research, publications, experiences, successes, failures, and web-based references, the reader should be able to glean a reasonably objective picture of “what happened when, how it happened, and why it happened.” Thus, we’ll enjoy a fresh and rich context when discussing the “nuts and bolts” of what undergirds the Six Sigma Model (SSM).  So, this treatise should be viewed as a smorgasbord of knowledge that is overlaid on a chronology of key activities and events.


Telling the Story
During the early to mid-1980s, Dr. John S. Ramberg, a highly distinguished professor of engineering at the University of Arizona, worked closely with some of Motorola’s statistical, reliability, and quality engineers (including this author) who, at that time, were heavily engaged in constructing a foundation for eradicating product- and service-related defects. This initial foundation served as a backdrop for the landmark work that would later become holistically known as Six Sigma. As an example of such foundational work, consider this author’s development of a problem-solving model called “The Logic Filters.” This model was created during the 1980–1984 timeframe while the author was working as a doctoral student and instructor at Arizona State University, School of Engineering and Applied Sciences, Division of Technology.

Many thanks are given to Dr. Louis Pardini (ASU professor and doctoral committee member) for his assistance and guidance during the system’s research, development, and prototype phases. Of interest, the Logic Filters model was first published in 1985 within a series of process improvement manuals created by this author and subsequently used by Motorola GEG. Essentially, the Logic Filters were a progressive system of improvement tools. The aim was to reduce a large number of input variables to the “vital few,” after which such variables would be optimized and subsequently controlled over time (using statistical control methods). While this may seem commonplace today, it was breakthrough thinking at the time.

The Motorola Government Electronics Group officially adopted and implemented The Logic Filters in 1985. However, this author continued to refine the system over time. As an outgrowth of these refinements, this author created the MAIC improvement strategy (Measure–Analyze–Improve–Control). In support of this, consider the white paper entitled “SW-DMAIC: A Six Sigma New Generation for Improving Software Development Processes,” authored by Tonini, Laurindo, and Spínola. This white paper was delivered at the 19th International Conference on Production Research. In this paper, they stated:

“The method was developed by Dr. Mikel Harry and was called MAIC (acronym of Measure, Analyze, Improve and Control). It consisted in four stages: Measure (assessment of data collection about the present process situation), Analysis (understanding the causes of the present performance of a process), Improvement (elaboration of improvement alternatives in the process performance) and Control (procedures to keep the improvements obtained and to make them long-lasting) … Taking advantage of the experience acquired at Motorola, Dr. Harry elaborated the DMAIC method, including in the original method an initial stage for defining the problem.”

The first edition of this work (version 1.0) was published by this author during 1993, while versions 2.0 and 3.0 were published in 1994. The MAIC strategy later became DMAIC when this author added the “D” (Define) while deploying Six Sigma at General Electric in 1995.

Many thanks are extended from this author to Mr. Kjell Magnusson, then Sr. Vice President, Asea Brown Boveri, Ltd. While in Germany during the course of 1993, Mr. Magnusson spent much time working with this author. Owing to his expertise as a Design Engineer for large-scale transformers and his proven skills as a business executive, he was instrumental in assisting this author with transitioning The Logic Filters into the MAIC format. His untiring helpfulness and zest for improvement will not be forgotten.

Laying the Foundation
In 2002, Quality Digest published an article by Dr. Ramberg entitled: “Six Sigma: Fad or Fundamental?”  In this telling article, Dr. Ramberg stated: “Smith and Harry’s initial Six Sigma umbrella included SPC, ADT and PE. Later, they added design for manufacture (product capability and product complexity) and, as quality was linked to business performance, accomplishing quality through projects.  Motorola’s design margin had been 25 percent (4s or Cp = 1.33). When [Bill] Smith noted that escaping and latent defects under this strategy were far too high, he reasoned that the disparity between actual reliability and the reliability expected at final test could be accounted for by increased product complexity and deviations of the process mean from the target value, arriving at a value of 1.5 sigma.”

Dr. Ramberg later went on to say: “As Motorola set out on its quality journey, Harry noted that the company ran into a five sigma wall. Motorola found that it could attain a three-sigma level by installing process improvement and control in its own installations, and improve this to the four- or five-sigma level through the education of its suppliers. However, Six Sigma only became possible once the company had attained a better understanding of the role of robust design – systems design, parameter design and tolerance design.”

The foundational work described by Dr. Ramberg was underscored years later in an October 2008 Motorola presentation authored by Ms. Tina Huesing, then Corporate Director of Six Sigma at Motorola. Her presentation was entitled: “Six Sigma Through the Years.” In this presentation, Ms. Huesing stated that in the early 1980s: “Executives and managers were encouraged to hire statistics experts in their groups, e.g Arizona: Mikel Harry – GEG, Mario Perez-Wilson – SPS Phoenix, Skip Weed – SPS Mesa, adding to internal experts like J. Ronald Lawson, Eric Maass, Tony Alvarez, SPS Mesa and professors / consultants like Dr. Dennis Young and Dr. Douglas Montgomery from Arizona State University.  Janet Fiero at MTEC rolls out series of Statistics courses, including a course by the external consultant, Dorian Shainin which captured the imagination of a senior engineer named Bill Smith.”

The efforts of these contributors served to build the foundation for using statistical methods during the course of improving product and service quality.  The resulting body-of-knowledge (BOK) was subsequently used by the Motorola Training and Education Center (MTEC) to develop and deliver several very popular courses related to the diagnosis, improvement and control of manufacturing processes. The initial research and development for using statistical engineering methods to augment the design phase of product development was first conducted by this author (Co-Creator of Six Sigma) and Mr. Bill Smith (Father and Co-Creator of Six Sigma).  This groundbreaking work resulted in a unique BOK that was subsequently used to shape a highly successful MTEC program called: “Design for Manufacturability.”  In support of this, Ms. Huesing stated:

“Bill Smith and Mikel Harry created a class for MTEC [Motorola Training and Education Center] called Design for Manufacturability.  The main thrust of the course was to improve process capability to the point that no more than 3.4 defects per million opportunities would be created when mated with their respective design specifications.  After some initial course development and piloting, Mr. Smith and Dr. Harry collaborated to perfect the approach.  Looking back now, it’s easy to say this class was the first step in formalizing what is known today as Design for Six Sigma (DFSS).” 
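The 3.4 defects-per-million figure quoted above is not arbitrary; it is the one-sided normal tail area beyond 4.5σ, i.e., a 6σ design margin degraded by the 1.5σ shift in process centering that Dr. Ramberg described earlier. A minimal sketch of that arithmetic (the function name and structure are this illustration’s own, not Motorola’s original code):

```python
import math

def dpmo(sigma_level, shift=1.5):
    """Defects per million opportunities for a process whose nearest
    specification limit sits sigma_level short-term standard deviations
    from the target, after the mean drifts by `shift` sigma toward it."""
    z = sigma_level - shift                       # effective distance to the limit
    p_defect = 0.5 * math.erfc(z / math.sqrt(2))  # one-sided normal tail, Phi(-z)
    return p_defect * 1e6

print(round(dpmo(6.0), 1))   # 6-sigma design with the 1.5-sigma shift -> ~3.4
print(round(dpmo(4.0)))      # the older 4-sigma (Cp = 1.33) margin -> ~6210
```

The same function makes the gap between the two design margins vivid: the shifted 4σ margin yields over 6,000 defective opportunities per million, while the 6σ margin yields fewer than four.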


Evaluating the Goal
During the period extending from 1985 through 1990, this author worked closely with Mr. Bill Smith to further define and refine the SSM.  The overarching aim of this fruitful collaboration was to improve the ways and means of achieving design robustness.  This meant making our products, processes and services resilient or otherwise tolerant to inherent sources of natural variation, especially random perturbations in process centering.

During this period of time, the idea of robust design was just coming on the scene, as some would say. Most notable among those at the forefront of this amazing technology was Dr. Genichi Taguchi. This author spent much time learning his philosophy, theory, and practices as they related to product and process design. Over the course of time, this author collaborated with several others deeply involved in the practice of robust design, among them Dr. Barry Bebb. Then, during November 1991, this author was selected as a key speaker at the “World Class by Design” conference in New York. Of course, the topic was making product designs insensitive to variation. Establishing performance tolerances that are impervious or otherwise insensitive to variation meant that we could relax many of our “Trivial Many” specifications and desensitize the “Vital Few.” From a numerical point of view, this speaks loudly to the 80/20 rule.

This simple but powerful rule suggests that roughly 80% of the variation in a product’s performance is created during production, yet could have been avoided by optimizing only 20% of the contributing design factors. So, by first conducting a component sensitivity analysis, the key component specifications can be quickly isolated and prioritized for subsequent optimization. As many know, centering a process is relatively easy during production, but reducing product variation is a difficult, drawn-out, and often quite costly affair. Beyond question, controlling variation during the design phase is far easier and less expensive than during the production phase. However, the big question has been (and often still is): “How can a product or process configuration be made less complex, more robust, more reliable, more producible, and less costly – all at the same time?”

Well, the answer to this question is likely not as complicated or deep as some might believe.  For example, it’s often possible to leverage component-level nonlinearities and interactions to greatly reduce the probability of a defect, while concurrently relaxing the associated specifications.  This is but one of several ways a Design Engineer can achieve higher levels of performance in terms of quality, reliability, producibility and sustainability – and all for less cost.
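As a toy illustration of leveraging a nonlinearity (the transfer function, nominal values, and tolerances below are invented for the example), consider a response that is steep in one operating region and flat in another: placing the nominal on the flat portion shrinks the output spread even when the component tolerance is relaxed fourfold.

```python
import math
import random

random.seed(7)

# Invented nonlinear transfer function: output vs. a single design factor x.
def response(x):
    return 10.0 * math.sin(x)

def output_sd(nominal_x, tol_sd, n=50_000):
    """Monte Carlo spread of the response when x varies about its nominal."""
    ys = [response(random.gauss(nominal_x, tol_sd)) for _ in range(n)]
    mean = sum(ys) / n
    return math.sqrt(sum((y - mean) ** 2 for y in ys) / (n - 1))

# Steep region (x near 0) with a tight tolerance...
steep = output_sd(nominal_x=0.0, tol_sd=0.05)
# ...versus the flat region (x near pi/2) with a tolerance 4x looser.
flat = output_sd(nominal_x=math.pi / 2, tol_sd=0.20)

print(f"steep region, tight tolerance: sd = {steep:.3f}")
print(f"flat region, loose tolerance:  sd = {flat:.3f}")
```

Despite the looser tolerance, the flat-region design produces the smaller output spread, which is the essence of buying robustness from a nonlinearity rather than from tighter (and costlier) component specifications.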

For any highly sensitive specifications that could not be made robust through such means, our recommended course of action was to completely eliminate the related part or component from the design configuration.  We discovered that this could be accomplished with some innovative thinking – often supported or otherwise facilitated by mathematical, statistical, and physical models that, when coupled with computer simulations and breadboard prototypes, brought about many insights into how a Six Sigma level of performance could be achieved.

Like a falling chain of dominoes, improving the robustness of a design can lead to significantly higher levels of reliability and reduced design complexity (part count), while concurrently reducing costs and enhancing customer satisfaction. Essentially, moving in this direction quickly shifted our mindset from being in the Quality Business to being focused on Business Quality. Almost naturally, this change of thinking enlarged the mission of Six Sigma in both scope and depth.


Spreading the Word
In 1986, the Motorola Science Advisory Board and the Motorola Corporate Engineering Council endorsed the SSM and its related performance objectives. Also during 1986, this author published the first official treatise on the topic of Six Sigma. The booklet was entitled “The Nature of Six Sigma Quality” and was initially published and distributed by the Motorola Government Electronics Group (GEG) before being elevated to Motorola University Press. This was the first document on the subject to set down the details of the SSM. The second and third editions were published in 1987 and 1988, respectively.

At a business meeting in 1993, the Director of Motorola University Press informed this author that over 500,000 imprints of the aforementioned booklet had been made and distributed between 1988 and 1993. Where the corporate world is concerned, this level of interest was a strong testament to the reach of Six Sigma, especially the incredible results it was yielding during that period. As the word spread, more people came forward with innovative ideas and new ways to enhance the odds of our success. This overwhelming response to the booklet spurred this author to expand the scope and depth of his investigations into Motorola’s design capability. This was done in the interest of developing new and more innovative engineering models, methods, and tools. To this author’s deep gratitude, these diligent efforts did not go unnoticed by the company’s top executives.

In 1987, Motorola officially adopted Six Sigma. Also in 1987, this author, along with Mr. Reigle Stewart, completed the co-development of several mechanical Design Engineering methods and subsequently published a portion of that work under the title “Six Sigma Mechanical Design Tolerancing,” for which both authors received a prestigious Motorola engineering award. To this day, these methods are still being used by many corporations and universities. During 1988, Motorola was awarded the first Malcolm Baldrige National Quality Award, for which Six Sigma was highly credited. Also during 1988, this author, along with Dr. Ron Lawson, developed a probability-based approach for characterizing and optimizing the producibility of a product, process, or service design.

The result of this innovative work was first published in 1988 by Motorola’s Government Electronics Group under the title: “Six Sigma Producibility Analysis and Process Characterization.”  Motorola University elevated this book and assumed the publication responsibilities in 1990.  The second edition was co-published in 1992 by Motorola University Press and Addison-Wesley Publishing Company, Inc. The methods set forth in this book were formally adopted in 1989 by the United States Navy as a NAVSO standard under the title: “P-3679: Producibility Measurement Guidelines/Methodologies.”  In recognition of this work, Mr. Ernie Renner, then Director of the US Navy Best Manufacturing Practices Program, presented this author with a widely celebrated engineering award.


Accelerating the Quest
In 1989, Mr. Robert Galvin (then Motorola CEO and Chairman of the Board) asked this author (then Chief Statistician and Member of the Technical Staff at Motorola GEG) to devise a strategic corporate plan to increase the momentum of Six Sigma within the company. This author’s proposal was subsequently accepted by Mr. Galvin in late January of 1990. In April of 1990, this author moved from Phoenix, Arizona to the greater Chicago area and formed the Motorola Six Sigma Research Institute (SSRI).

At the core of this institution’s purpose was the amassing of evidence that would give additional theoretical and empirical credence to the plausibility, feasibility, and rationality of the Six Sigma Model (SSM). Based on the revised SSM, new and innovative Design Engineering models, methods, and tools were developed by SSRI and then packaged into training programs by Motorola University for subsequent deployment and implementation. At the same time, several top corporations became sponsors of SSRI, such as Texas Instruments Defense Group, Asea Brown Boveri, International Business Machines (IBM), Digital Corporation, and Eastman Kodak. These sponsors provided SSRI with substantial economic resources and best-in-class full-time engineers and scientists. Thus, the scope and quantity of Six Sigma research was greatly increased.

In turn, this drove the creation of more tools that were ultimately disseminated throughout Motorola and the partnering companies. On this subject, this author would like to note that several of the key features of the initial SSM had been identified and expounded upon a few years earlier through the brilliant contributions of Mr. Bill Smith. His tremendous insights into product reliability and its consequent relationship to latent defects were manifested as empirical and experiential postulates, theories, and assertions. In this way, Mr. Smith and this author laid a strong foundation from which SSRI could develop new and innovative engineering models, methods, and tools in the fields of electrical and mechanical design engineering.