
The Triz Journal | November 21, 2017


Applying TRIZ to Software Problems - Creatively Bridging Academia and Practice in Computing

On 14 Oct 2002

This article was first published in the proceedings of TRIZCON2002, The Altshuller Institute Conference, May, 2002.

Kevin C. Rea, Principal Consultant
REA Consulting
E-mail: kcronline@gmail.com
Web site: http://kevincrea.com/

1. Abstract

The introduction of structured innovation for computer science is needed to teach effective software-problem solving and new concept generation; realistically, however, curricula rarely change and industry practices are usually set in stone. This paper reports the author’s research efforts to advance TRIZ in the software-problem domain. The current state of academic and industry curricula is briefly discussed, and context is introduced from two points of view: organizational processes as they relate to algorithms for creativity, and information modeling using Su-Field analysis.

2. Introduction

Solving software problems should be, in theory, systematic; yet in practice, it remains very much an art. This art is also a decent business, especially for big software companies that sell you technical-support subscriptions so you can debug their products for them – now that’s creative.

A majority of software problems today are caused by trial-and-error programming. While formal methods apply logic and simple mathematics to programming, they are not enough to tackle the increasing complexity that software demands now and in the future. Critical systems such as missile defense, anti-terrorism detection, and life support need software that works safely and consistently per specification. Thus, the introduction of structured innovation is needed to bridge academia and practice, ultimately to provide creative solutions for the challenges of the 21st century.

3. Current Situation

Looking at the industry, one can see that software is splintering into more and more special interest groups (SIGs); this is happening in both camps – academia and practice. Table 1 – Computing Areas of Focus lists the major focus groups of both academia and practice.

Table 1 – Computing Areas of Focus


Academia1

  • Algorithms and Complexity (AL)
  • Discrete Structures (DS)
  • Programming Fundamentals (PF)
  • Programming Languages (PL)
  • Architecture and Organization (AR)
  • Operating Systems (OS)
  • Net-Centric Computing (NC)
  • Human-Computer Interaction (HC)
  • Graphics and Visual Computing (GV)
  • Intelligent Systems (IS)
  • Information Management (IM)
  • Software Engineering (SE)
  • Social and Professional Issues (SP)
  • Computational Science and Numerical Methods (CN)
  • Distributed Systems – Concurrency (CN)

Practice (ACM SIGs)

  • Algorithms and Computation Theory
  • Ada Programming Language
  • APL Programming Language
  • Applied Computing
  • Computer Architecture
  • Artificial Intelligence
  • Computers and the Physically Handicapped
  • Computers and Society
  • Computer-Human Interaction
  • Data Communication
  • Computer Personnel Research
  • Computer Science Education
  • Computer Uses in Education
  • Design Automation
  • Systems Documentation
  • Electronic Commerce
  • Computer Graphics and Interactive Techniques
  • Groupware
  • Information Retrieval

The Joint Task Force on Computing Curricula published its report in the fall of 2001. The goal of the report was to develop curriculum guidelines for undergraduate programs in computing. Computing has changed dramatically over time in ways that have a profound effect on curriculum design and pedagogy. Moreover, the scope of what we call computing has broadened to the point that it is difficult to define it as a single discipline. Past curriculum reports have attempted to merge such disciplines as computer science, computer engineering, and software engineering into a single report about computing education.

Although the focus groups are essential in defining the body of knowledge in each discipline, they are not in themselves sufficient to solve the complex software problems that plague our industry today. Certain aspects of both academia and practice are out of our control: 1) significantly changing the computer science and software engineering curricula in academia, and 2) changing the internal processes of companies to meet a special innovation mandate. There must be some way to “bridge” what is going on in both camps and thus harness the complementary synergy between them. The splintering described above suggests that whatever inventive methodology is used will need to account for the different contexts present in these growing groups.

The categories used in academia and practice generally do not reflect the categories in the US patent fund. If we want to spawn true innovation across a society, we need to close the loop with the patent office and feed innovation needs back to academia. These patent-fund categories are shown in Figure 1.


Data Processing (USA.700 – USA.707, USA.716, USA.717)

USA.700 – Generic control systems or specific applications
USA.701 – Vehicles, navigation, and relative location
USA.702 – Measuring, calibrating, or testing
USA.703 – Structural design, modeling, simulation, and emulation
USA.704 – Speech signal processing, linguistics, language translation, and audio compression/decompression
USA.705 – Financial, business practice, management, or cost/price determination
USA.706 – Artificial intelligence
USA.707 – Database and file management, data structures, or document processing
USA.716 – Design and analysis of circuit or semiconductor mask
USA.717 – Software development, installation, and management

Figure 1 – Current US Patent Categories for Data Processing

Current Areas of Research

Applying TRIZ in the software-problem domain has many potential benefits, not only to the software problem solvers, but also the TRIZ community in general. One major benefit is the concerted effort to evolve the science of TRIZ so as to avoid obsolescence.

The author is currently active in the following tasks:

  1. Brief analysis of the current situation in academia and practice,
  2. Development of algorithmic templates for software problems,
  3. Potential enhancement of TRIZ tools (Su-Fields with context),
  4. Analogies of inventive principles in software2,
  5. Software patent-fund analysis.

NOTE: Activities 1, 2, and 3 are presented in this paper.

4. Su-field Enhancement

At a low level, all software is data, whether it is for a network process like a Web server or an inter-process object like an ActiveX component. This data is then organized into something useful – information. It is this information that we try to model when reasoning about software systems. There are many modeling techniques used in software engineering today. The Unified Modeling Language™ (UML) is the industry-standard language for specifying, visualizing, constructing, and documenting the artifacts of software systems. It simplifies the complex process of software design, making a “blueprint” for construction, yet it does not provide problem-solving tools. One goal of this research is to develop templates that provide inventive tools for methodologies such as UML. Possible enhancements of Su-fields for software modeling in UML hold, at a minimum, the ability to escape rigid rules and idioms.


Figure 2 – Su-field Model3

Substance-Field (Su-field) Analysis is a TRIZ analytical tool for modeling problems related to existing technological systems. Every system is created to perform some functions. The desired function is the output from an object or substance (S1), caused by another object (S2) with the help of some means (types of energy, F). The general term, substance has been used in the classical TRIZ literature to refer to some object.

Substances are objects of any level of complexity; they can be single items or complex systems. The action or means of accomplishing the action is called a field. Su-field analysis provides a fast, simple model for considering different ideas drawn from a knowledge base and for zooming in on the zone of interest. The analysis can be applied at the system level as well as the component level of abstraction; the zone of interest is often at the interface between the two substances. For complex systems, there is a Su-field model for each zone of interest. Two substances and a field are necessary and sufficient to define a working technical system.

There are four basic models:

  • Effective complete system,
  • Incomplete system (requires completion or a new system),
  • Ineffective complete system (requires improvement to create the desired effect),
  • Harmful complete system (requires elimination of the negative effect).

If there is a problem with an existing system and any of the three elements are missing, Su-field analysis indicates where the model requires completion and offers directions for innovative thinking. If there is an innovative problem and the system has the three required elements, Su-field analysis can suggest ways to modify the system for better performance. This is particularly true if radical changes in the design are possible.

The field, itself often some form of energy, provides the energy, force, or reaction that guarantees an effect. The effect could be on S1, or it could be the output of field information. The term field is used in the broadest sense, including the fields of physics (that is, electromagnetism, gravity, and the strong and weak nuclear interactions). Other fields could be thermal, chemical, mechanical, acoustic, light, etc. The two substances can be whole systems, subsystems, or single objects; they can also be classified as tools or articles. A complete model is a triad3 of two substances and a field. The innovative problem is modeled to show the relationships between the two substances and the field; complex systems can be modeled by multiple, connected Su-field models.

There are four steps to follow in making the Su-field Model:

  • Identify the elements. (The field is either acting upon both substances or is within substance S2 as a system)
  • Construct the model.

After completing these two steps, we stop to evaluate the completeness and effectiveness of the system. If some element is missing, we try to identify what it is.

  • Consider solutions from the 76 Standard Solutions.
  • Develop a concept to support the solution.
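The identify-construct-evaluate steps above can be sketched in code. The following Python fragment is a minimal, hypothetical illustration (the class, field names, and classification strings are my own, not part of any TRIZ standard): it records the S1/S2/F triad and classifies it against the four basic models.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Interaction(Enum):
    """Quality of the action the field produces between the substances."""
    EFFECTIVE = "effective"
    INSUFFICIENT = "insufficient"
    HARMFUL = "harmful"

@dataclass
class SuField:
    """Minimal Su-field triad: substances S1 (article), S2 (tool), field F."""
    s1: Optional[str] = None
    s2: Optional[str] = None
    f: Optional[str] = None
    interaction: Interaction = Interaction.EFFECTIVE

    def classify(self) -> str:
        """Map the triad onto the four basic Su-field models."""
        if None in (self.s1, self.s2, self.f):
            return "incomplete system"
        if self.interaction is Interaction.HARMFUL:
            return "harmful complete system"
        if self.interaction is Interaction.INSUFFICIENT:
            return "ineffective complete system"
        return "effective complete system"

# Steps 1-2: identify the elements and construct the model, then evaluate.
model = SuField(s1="message", s2="producer process", f="computation")
print(model.classify())  # effective complete system
```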


Figure 3 – Su-field Symbols4

Context and Time in Information Modeling – the metapattern5

Shifting gears temporarily: context-oriented analysis may be the next big wave because, on the conceptual level, it can solve problems elegantly. The goal of this research is to see whether Su-field analysis can be enhanced to include new fields, such as “computation,” with the notion of context and time.

[‘Metapattern’ is a formalization of context, time and validity in information models.]

Metapattern is an exciting new approach to conceptual information modeling. It is a technique for meta-information analysis and modeling that emphasizes reusability. With the metapattern, analysis is recognized as a critical activity in its own right during information system development. The metapattern is not a method for technical design or software engineering; it is a highly focused analysis tool. But precisely because it provides powerful support for analysis, the metapattern helps focus all other development and management activities during the complete life cycle of systems. Overall success results from the great precision now available in modeling, particularly through the combination of a finely grained concept of time stamping and a recursive, simple but formal concept of context.

Social psychologists have known for a long time that the overall behavior of a person is never completely consistent. Rather, consistency is limited to what is called – in metapattern terms – a context. The type that determines behavior is no longer an object-as-such but an object-in-context. Actually, context is type. This shift in the meaning of type has advantageous consequences: first, for all conceptual information modeling; second, and practically, for operational information systems. Additionally, according to the metapattern, provisions for change are present at the smallest relevant scale of information. This means that, for any point in time, any aggregation of information can be unambiguously reported; when the time of relevance changes, the dynamics of information become visible.

An object behaves according to a particular situation; that is, its behavior can change from one situation to another. For instance, if you only know Bill as the next-door neighbor, you might be surprised to see him in action at work or visiting his mother. Another word for situation is context. Thus, primary attention shifts to what is around an object. We use the term context because most people seem familiar with it in the sense of appreciating why something is not always completely the same thing.

Metapattern is a technique for information modeling that emphasizes flexibility and reusability. It helps the modeler strike a balance between general structure (abstraction) and instances (concreteness) of information. The resulting conceptual models will always optimally accommodate requirements. The metapattern presumes that an object’s behaviors are completely different from one context to another. Odd as this may look, the absence of any shared properties by an object among its contexts exemplifies the metapattern. To explain further, suppose a particular context A exists, and suppose that a specific object x is contained by A. An example would be Bill (x) at work (A). This relationship is shown in Figure 4 – Object in context and multiple contexts.


Figure 4 – Object in context and multiple contexts
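The object-in-context idea can be sketched in code: the same object carries a separate behavior per context, with no behavior shared across contexts. The class name and the Bill example below are purely illustrative; this is a minimal sketch, not part of the metapattern formalism.

```python
class ContextualObject:
    """An object whose behavior is determined entirely by its context."""

    def __init__(self, name):
        self.name = name
        self._behaviors = {}          # context -> behavior ("context is type")

    def in_context(self, context, behavior):
        self._behaviors[context] = behavior
        return self

    def behave(self, context):
        # An unknown context raises KeyError, mirroring the metapattern's
        # "absence of any shared properties" among contexts.
        return self._behaviors[context]

bill = (ContextualObject("Bill")
        .in_context("work", "reviews designs")
        .in_context("home", "mows the lawn"))
print(bill.behave("work"))  # reviews designs
```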

Another context, say B, could also contain the object x. Bill (x) is either at work (A) or at home (B). Contexts A and B may overlap, as in Figure 5 – Object in overlapping contexts and conversion of disjunctive contexts. However, the overlap can be reduced, in this case to three disjunctive contexts, as shown in the following figure. ( ! = NOT, ^ = AND)


Figure 5 – Object in overlapping contexts and conversion of disjunctive contexts
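The conversion of two overlapping contexts into three disjunctive ones maps directly onto elementary set operations. A small illustrative check in Python (the member names are made up):

```python
# Two overlapping contexts reduced to three disjoint ones:
# A ^ !B, A ^ B, !A ^ B.
A = {"x", "y"}        # objects in context A
B = {"x", "z"}        # objects in context B

only_a = A - B        # A ^ !B
both   = A & B        # A ^ B
only_b = B - A        # !A ^ B

# The three parts are pairwise disjoint and together cover A | B.
assert only_a | both | only_b == A | B
assert only_a.isdisjoint(both) and both.isdisjoint(only_b)
print(only_a, both, only_b)
```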

In the software world, it is fashionable to distinguish between instances and types. To call an object a person, for example, usually means that a particular object instance is referred to, one which obeys the object type of person. The relevant instances or elements define a set by extension. Following the extensional approach, set members may or may not share their properties. In contrast, a type intensionally specifies a set – that is, through conditions that must hold valid for elements to qualify as set members. The assumption that Bill is of the person type unambiguously specifies his properties. With the person type assigned to John, he too will exhibit exactly the same properties.
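The extension/intension distinction can be made concrete: an extensional set enumerates its members, while an intensional one is given by a membership condition. A hypothetical sketch (the predicate and the property names are my own, chosen only for illustration):

```python
# A set by extension: simply enumerate the members.
persons_by_extension = {"Bill", "John"}

# A set by intension: a condition that must hold for membership.
def is_person(obj):
    """Hypothetical 'person type': any record with a name and a birthdate."""
    return isinstance(obj, dict) and "name" in obj and "birthdate" in obj

bill = {"name": "Bill", "birthdate": "1960-01-01"}
john = {"name": "John", "birthdate": "1965-05-05"}

# Both satisfy the same condition, so both exhibit the same type-level
# properties, as the text argues for Bill and John.
assert is_person(bill) and is_person(john)
assert not is_person({"name": "Rover"})   # fails the condition
```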

Figure 4 suggests that object x exists in context A. Simultaneously, however, the idea of context leads to an independent consideration of the object. This suggests an explicit relationship; that is, between a context and an object. It is this explicit relationship that I believe can contribute greatly to using Su-field analysis for software problems.

Hypothesis Problem

As an example of using Su-field analysis in the context of software, we have the problem of implementing a multi-slot communications buffer. If messages are produced at approximately the same rate at which they are consumed, a single buffer provides reasonable performance since a process would not generally have to wait very long for access to the buffer. Commonly, however, producer and consumer execution is in bursts. For example, a process writing an output file might produce several lines at once, and then do more computation before producing another set of lines. In such cases, a buffer capacity larger than one can significantly increase performance by reducing the number of times processes block (blocking is when a process is waiting for something to happen before it can advance its work). This is an example of the classic time/space tradeoff in computer systems.


Figure 6 – Communications Buffer Example

Figure 6 – Communications Buffer Example (a) depicts a Producer, which places (Deposits) a message (msg) in a predefined memory location, called the buffer or buf. The Consumer “consumes” (Fetches) the msg from the buffer. The buffer contains a queue of messages that have been deposited but not yet fetched. This queue can be represented by a linked list or an array. The problem is that the Producer does not necessarily deposit messages at the same “rate” at which they are consumed.
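A runnable sketch of this multi-slot buffer using Python's standard queue module: Queue(maxsize=N) blocks a depositing producer only when all N slots are full, and a fetching consumer only when the buffer is empty, so a bursty producer blocks far less than with a single-slot buffer. The capacity and message count below are illustrative only.

```python
import threading
import queue

buf = queue.Queue(maxsize=8)   # the shared multi-slot buffer

def producer():
    for i in range(20):        # a burst of messages
        buf.put(f"msg-{i}")    # Deposit: blocks only if all 8 slots are full
    buf.put(None)              # sentinel: no more messages

def consumer(out):
    while True:
        msg = buf.get()        # Fetch: blocks only if the buffer is empty
        if msg is None:
            break
        out.append(msg)

received = []
t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer, args=(received,))
t1.start(); t2.start()
t1.join(); t2.join()
print(len(received))  # 20
```

Raising maxsize trades memory (space) for fewer blocked processes (time), which is exactly the classic time/space tradeoff the text describes.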

An initial model using Su-field analysis may look like the following. Note that the field is represented in the relationship of the useful action below.


Figure 7 – Initial Su-field Model of Problem

The situation implies that we have an insufficient desired effect (ineffective complete system) – the rate at which we transfer messages is carried out with low efficiency.

Figure 8 depicts the transformation of the Su-field, showing a buffer as a shared resource. The psychological goal of adding the context is to show that S2 and S3 are very much alike in their relationship to the computation at hand. This leads me to wonder whether one process could be used instead of two to perform the same task. Is it possible to use the standard solution of inversion based on what context the process is in? In this case, the process would already know when it is done depositing the data, and resources would be saved.

The transformation with context shows a need for a shared resource to perform synchronization of deposit and fetch procedures. The data buffer S1 has two contextual relationships factored by time and validity. Insight into how TRIZ can be applied (inversion?) to this problem is the tip of the iceberg.


Figure 8 – Transformation with Context

The combination of Su-field analysis with the inclusion of context lends additional insight for modeling in the software domain. At present, this work is still very much under development. It is my hope that I will have sufficient data to continue along this path. Is it possible to enhance the Su-field notation to include context into the modeling process? I think so.

5. Context-Templates

Context-templates (a.k.a. meta-templates) serve to complement TRIZ in the context of software problems, as well as other thinking methods, e.g., Advanced Systematic Inventive Thinking (ASIT)6.

Here are the current goals of the context-template:

  1. Integrate with the most applicable tool for solving software problems of an inventive nature (e.g., TRIZ-ARIZ, ASIT-Unification, Innovation Platform™),
  2. Have a mechanism for directing the solver to the most appropriate and strongest solution as well as the most promising strategy,
  3. Provide access to important, well-organized, and necessary information at any step of the problem-solving process,
  4. Force the user to follow the steps with discipline,
  5. Offer the flexibility to add, delete, or enhance a template in accordance with an emerging or changing software technology and/or with the underlying inventive tool (e.g., ARIZ-71 to ARIZ-2001),
  6. Provide a creative, thought-provoking experience for people dealing with software problems.

Conceptually, the templates overlay a methodology or tool and expose contextually relevant information. As an example, ARIZ-61 divides the creative process into three stages: analytical, operative, and synthetic. Each stage is divided into several sequential steps. Thus, the algorithm separates a single complex action into several much easier actions. It looks like this:

ARIZ-61

Part One: Analytical stage

1) State the problem.
2) Imagine the ideal final result (IFR).
3) Determine what interferes with attaining the result.
4) Determine why it interferes.
5) Determine under what condition that will not interfere.

Part Two: Operative stage.

1) Explore the possibility of making changes in the object itself.
2) Explore the possibility of dividing an object into independent parts.
3) Explore the possibility of altering the outside environment.
4) Explore the possibility of making changes in neighboring objects.
5) Study prototypes from other industries.
6) Return to the original problem and widen that problem’s condition – make the transition to a more general problem statement.

Part Three: Synthetic stage.

1) Change the shape of a given object – the machine with the new functions should have a new shape.
2) Change other objects that interact with the one under consideration.
3) Introduce changes into the means of object functionality.
4) Explore the implementation of the newfound principle in solving other technical problems.


Figure 9 – Overview of Context-Template

Figure 9 – Overview of Context-Template depicts the concept of an overlay on top of existing tools and methodologies. This approach gives flexibility while maintaining a familiar space for the software problem solver. Part of the meta-template is a flowchart, as depicted in Figure 10 – Abstract View of Template. This flowchart is specific to a particular software discipline and the procedures of a particular organization.


Figure 10 – Abstract View of Template

The goal of the flowchart is to filter particular steps that do and do not apply to a particular software discipline.
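One way to realize this filtering is to treat the template as data, with each step tagged by the disciplines it applies to; the flowchart's role then reduces to a simple selection. A hypothetical Python sketch (the step texts and discipline tags are illustrative, not taken from any published template):

```python
# A context-template as data: each step carries tags for the software
# disciplines it applies to.
TEMPLATE_STEPS = [
    ("State the problem.", {"all"}),
    ("Imagine the ideal final result (IFR).", {"all"}),
    ("Study prototypes from other industries.", {"embedded"}),
    ("Explore altering the outside environment.", {"embedded", "web"}),
]

def filter_steps(steps, discipline):
    """Keep only the steps tagged 'all' or tagged with the given discipline."""
    return [text for text, tags in steps
            if "all" in tags or discipline in tags]

print(filter_steps(TEMPLATE_STEPS, "web"))
```

Adding, deleting, or enhancing a template (goal 5 above) then amounts to editing the data, not the filtering logic.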

Conclusion

This paper has presented some of my personal thoughts and current research in applying TRIZ to the software-problem domain. The intent is to stimulate and foster feedback on that application. Although the jury is still out with respect to the validity of these enhancements and insights, at minimum I believe they are a start. There is still MUCH TO DO in terms of applying these concepts to a significant set of software problems. I don’t profess to have all the answers, but I believe I am on to something. The introduction of the context-template provides a way to introduce inventive principles into otherwise “concrete” schools of thought. The addition of “context” and “time” as they relate to information modeling in Su-Field analysis has significant potential and will be explored further in my research. My belief is that TRIZ does provide significant opportunities for solving software-related problems. Thinking the unthinkable about the evolution of software is needed to truly showcase TRIZ as a powerful tool in the software domain. I welcome your constructive feedback as well as any software problem ideas/challenges/insights you may have.

Footnotes

  1. Computing Curricula 2001 – Computer Science, Steelman Draft, August 2001; The Joint Task Force on Computing Curricula, IEEE Computer Society and the Association for Computing Machinery (ACM).

  2. Kevin C. Rea, TRIZ and Software – 40 Principle Analogies, Parts 1 and 2, The TRIZ Journal, September 2001.

  3. James F. Kowalick, Triads: Their Relationship to TRIZ, The TRIZ Journal, 1998.

  4. Semyon D. Savransky, Engineering of Creativity: Introduction to TRIZ Methodology of Inventive Problem Solving, CRC Press, 2000.

  5. Pieter Wisse, Metapattern, Addison-Wesley, 2001.

  6. Roni Horowitz, From TRIZ to ASIT in 4 Steps, The TRIZ Journal, August 2001.