Patterns of Evolution and Methodologies
Patterns of Evolution
Determining the life-cycle of a product or service is important; the next generations should be identified in advance so that they may be planned for at design conception. Modular design, among many other practices, will help prepare the system for the incremental step improvements on that product or service S-curve. The multi-generation product plan (MGPP) benefits from technology forecasting and better prepares the organization for the strategic and tactical resources required to improve that particular portfolio item. TRIZ (The Theory of Inventive Problem Solving) includes systematic tools for predicting system evolution. Maturity mapping and the patterns of evolution (each pattern is decomposed into lines of evolution – these are more detailed directions within each pattern) are a powerful combination that may be utilized to identify the next generations or versions of a product or service on its life-cycle curve and then incorporated into the MGPP.
There are eight main patterns of evolution in TRIZ. Systems evolve according to specific principles that have been derived by decomposing the historical progression of many products and services. One of the principles is the mono-bi-poly pattern of evolution (or reduction principle). The reduction principle may be used to identify the next generations of a particular system. This principle states that a system provides a function or a set of functions to a customer. The type and number of functions may be modulated over time to improve the product or service.
Within TRIZ, another way of looking at this pattern of evolution (increasing complexity, then simplicity) is in terms of hybridization. As a system gets more functionality and gets more complex, it splinters into more and different parts. But over time, the added functionality collapses or hybridizes back into a simpler design. This happens in manufacturing repeatedly as systems add part counts during the period of increasing functionality, then reduce part counts as designs are simplified to provide the same functionality.
Consider the evolution of the pencil. It began as just a piece of wood with a length of black lead. In TRIZ language, this simple writing system is a homogeneous mono-system. When an eraser was added, the pencil transformed into a heterogeneous bi-system: it performed two different functions, writing and erasing, within the same system. Later, mechanical pencils incorporated different colors of lead so you could write in black, red, green, and blue using the same instrument. In TRIZ-speak, this multi-colored mechanical pencil with its eraser became a heterogeneous poly-system: it performs multiple functions by adding more parts and complexity.
Now you have a multi-colored lead pencil that collapses the colors back into one length of lead which, depending on the angle it is held, will write different colors. With this advancement, you now have a new heterogeneous mono-system that writes all the different colors with nearly the simplicity of writing with one color, while still providing the eraser.
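The pencil's progression can be summarized in a small sketch. The labeling rule below is an illustrative reading of the mono-bi-poly pattern, not standard TRIZ software; the `classify` helper and the subsystem names are my own. Arity comes from how many subsystems are combined, and the homogeneous/heterogeneous label from whether those subsystems are alike:

```python
def classify(subsystems):
    """Illustrative TRIZ mono-bi-poly labeler (not standard TRIZ tooling).

    `subsystems` is a list of the parts combined into one system:
    arity counts them, and the system is homogeneous only when all
    of the parts are alike.
    """
    arity = {1: "mono", 2: "bi"}.get(len(subsystems), "poly")
    kind = "homogeneous" if len(set(subsystems)) <= 1 else "heterogeneous"
    return f"{kind} {arity}-system"

# The pencil's generations, as described above:
print(classify(["black lead"]))                    # homogeneous mono-system
print(classify(["black lead", "eraser"]))          # heterogeneous bi-system
print(classify(["black lead", "red lead", "green lead",
                "blue lead", "eraser"]))           # heterogeneous poly-system
```

The collapsed multi-color lead is the interesting case: the parts merge back into one subsystem, so the arity returns to mono even though the system still delivers several different functions, which is why the article calls it a heterogeneous mono-system.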
The pencil demonstrates how systems evolve toward increasing complexity and added functionality, and then toward increasing simplicity with no erosion of function. The evolutionary principle of hybridization is universal — meaning it has been validated over and over again. Naturally, in the drive to be better, you try to take the best of one system or technology and mix it with the best of another to get the best of both. This is the perennial drive of the fittest, and it will not be denied.
Another example of hybridization is the work of biologists who combine the best properties of one organism with the best properties of another, while simultaneously canceling the drawbacks of each as they relate to the objective at hand. An old example is the cross-breeding of two different plant seeds, one that survives in dry climates, the other that survives in cold places. Although neither could live in a cold, dry place, the hybrid can.
Because of intentional hybridization, the plant is more robust to temperature and moisture, and this principle explains what happens next in the progression of quality and TQM (total quality management). After the practice of statistical quality control (SQC) became solidified as a viable means for improving and controlling manufacturing processes, why not apply it to other processes as well – in procurement, administrative departments, distribution and marketing? Why can’t the seed grow in more than one climate? Why can’t the pencil write in blue and green and not just black? Why can’t the principles and practices of quality control be applied outside of manufacturing?
Evolution of Methodologies
By the force of evolution, SQC expanded and diversified into all departments and functions of an organization; the Japanese drove it there until it developed into total quality control (TQC). The homogeneous mono-system (SQC) became a homogeneous poly-system (TQC) as it extended the function of quality improvement and control to everything an organization does. (Not incidentally, around this same time, as the industrial economy grew rapidly across the globe, others were developing many of the other practices now taken for granted as the underpinnings of business success.)
In Japan in the 1950s, the forefathers of Lean manufacturing pioneered the methods of flow, waste reduction, inventory control and operational speed. In Russia, a team of engineers developed the empirical basis for product, process and organizational innovation. Also in Japan, others developed Hoshin Kanri methods, which quantitatively connect the functions and processes of an organization around strategic priorities.
Fast forward to the last three decades, when certain families of management tools collapsed into themselves, forming simpler and more integrated versions of formerly fragmented systems. By tracing the development of TQM, you arrive at a point in the United States when all the tools of SQC were packaged together for ease of deployment and application into a set of standards known as the Baldrige criteria. This was the defining time when the system known as TQM became a big tool itself: a homogeneous poly-system that reduced defects and variation and improved the quality of products and services focused on customer needs.
Still later, the components of TQM were dovetailed with other key systems and practices, such as the balanced scorecard. After all, what good is quality improvement if you can’t trace its impact? It was around this time that Motorola began driving hard with Six Sigma, which had its beginnings as a big hammer for pounding the nails of product quality to the point of no more than 3.4 defects per million opportunities at the quality characteristic level.
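That 3.4-per-million figure can be checked with a one-line normal-tail calculation. The sketch below assumes the conventional Six Sigma 1.5-sigma long-term mean shift (a convention of the methodology, not stated in this article) and uses only the Python standard library; the `dpmo` helper is illustrative, not part of any Six Sigma toolkit:

```python
import math

def dpmo(sigma_level, shift=1.5):
    """Defects per million opportunities for a process at `sigma_level`,
    after applying the conventional Six Sigma long-term mean shift.
    Uses the one-sided normal upper tail: P(Z > sigma_level - shift)."""
    z = sigma_level - shift
    tail = 0.5 * math.erfc(z / math.sqrt(2))  # upper-tail probability of a standard normal
    return tail * 1_000_000

print(round(dpmo(6.0), 1))  # 3.4  (the classic six-sigma figure)
print(round(dpmo(3.0)))     # 66807 (a three-sigma process, for comparison)
```

The comparison line shows why the metric resonated: moving a process from three sigma to six sigma cuts the defect rate by more than four orders of magnitude.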
After some evolutionary momentum, Six Sigma extended its data-driven reach to focus on creating significant financial return, first in the form of cost reductions born of process improvements, and later in the form of growth by its application in sales and marketing. In addition, Six Sigma injected the agenda of quality into the top executive level of corporations, materializing TQM’s former lip service to top management involvement.
With a connected system of performance metrics, hard accountability at the executive level, top- and bottom-line impact and large-scale deployment, Six Sigma achieved the dream of TQM and became the world-class mono-system for performance capability.
But for all this, Six Sigma is still just an extension of the quality movement, with new functionality first made more complex but now made simpler and more commoditized through programmed deployment designs, e-learning, and other software aids and technologies.
Michael S. Slocum, Ph.D., is the principal and chief executive officer of The Inventioneering Company. Contact Michael S. Slocum at michael (at) inventioneeringco.com or visit http://www.inventioneeringco.com.