IBM Technical Journals

Special report: Celebrating 50 years of the IBM Journals

The Experimental Compiling System

By F. E. Allen, J. L. Carter, J. Fabri, J. Ferrante, W. H. Harrison, P. G. Loewner, and L. H. Trevillyan

The Experimental Compiling System (ECS) described here represents a new compiler construction methodology that uses a compiler base which can be augmented to create a compiler for any one of a wide class of source languages. The resulting compiler permits the user to select code quality ranging from highly optimized to interpretive. The investigation is concentrating on easy expression and efficient implementation of language semantics; syntax analysis is ignored.

Originally published:

IBM Journal of Research and Development, Volume 24, Issue 6, pp. 695-715 (1980).

Significance:

Early compilers for higher-level languages relied heavily on recognizing important special cases of language use to produce high-quality code. The 1970s and 1980s saw a gradual but inexorable shift from reliance on special cases to systematic analysis of code and general transformations that optimized performance.

The IBM Thomas J. Watson Research Center has been at the center of this activity ever since its researchers developed the first Fortran compiler. The Experimental Compiling System project was an attempt to push the then-available analysis and optimization technology to its limit, using it to realize a “compiler-compiler” that addressed semantics rather than just syntax. This paper describes the aggressive use of interprocedural flow analysis based on summarized characteristics of procedures or primitives, intraprocedural flow analysis (including range analysis of values), code generation by in-lining (called procedure integration), and optimization. The optimizations included constant propagation, redundant expression elimination, dead-code elimination, and variable propagation and renaming. Because these analysis and transformation steps are performed using either declared knowledge about primitive machine operations or knowledge about language semantics inferred by analyzing the appropriate semantics-defining procedures, the technique is essentially independent of the source language and the target machine.
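To make the flavor of these transformations concrete, the following is a minimal sketch, in Python, of constant propagation followed by dead-code elimination over an invented straight-line three-address intermediate representation. It is an illustration only, not the ECS implementation: the IR tuples and the function names const_propagate and dead_code_elim are hypothetical, and in ECS such passes would run after procedure integration had already in-lined the semantics-defining procedures.

    # Toy straight-line IR: each instruction is a tuple (hypothetical format).
    #   ('const', dest, value)   dest := value
    #   ('add',   dest, a, b)    dest := a + b
    #   ('print', a)             observable use of a

    def const_propagate(ir):
        """Forward pass: fold an 'add' whose operands are known constants."""
        env = {}   # variable -> known constant value
        out = []
        for op in ir:
            if op[0] == 'const':
                env[op[1]] = op[2]
                out.append(op)
            elif op[0] == 'add':
                _, dest, a, b = op
                if a in env and b in env:
                    env[dest] = env[a] + env[b]
                    out.append(('const', dest, env[dest]))  # folded
                else:
                    env.pop(dest, None)  # dest is no longer a known constant
                    out.append(op)
            else:
                out.append(op)
        return out

    def dead_code_elim(ir):
        """Backward pass: drop definitions whose result is never used."""
        live, out = set(), []
        for op in reversed(ir):
            if op[0] == 'print':
                live.add(op[1])
                out.append(op)
            else:  # 'const' or 'add' defines op[1]
                if op[1] in live:
                    live.discard(op[1])
                    if op[0] == 'add':
                        live.update(op[2:])
                    out.append(op)
                # else: dead definition, dropped
        return out[::-1]

    program = [
        ('const', 'x', 2),
        ('const', 'y', 3),
        ('add',   't', 'x', 'y'),   # folds to ('const', 't', 5)
        ('add',   'u', 't', 'x'),   # folds to ('const', 'u', 7)
        ('print', 'u'),
    ]

    optimized = dead_code_elim(const_propagate(program))
    print(optimized)   # [('const', 'u', 7), ('print', 'u')]

On this sample program, the two additions fold to constants and the now-unused definitions of x, y, and t disappear, leaving only the constant that reaches the observable print.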

The project represents a midpoint in the evolution of compiler technology, contemporary with the 801 compiler effort also at Watson and envisioned around the same time as the “partial-evaluation” compilers by Andrei Ershov of the U.S.S.R. Academy of Sciences. Its background in optimization built upon important work by IBM Fellows John Cocke and Frances Allen. The project spawned work in interprocedural flow analysis in the presence of pointers, the automatic generation of data-flow analyzers, program-dependence graphs, and static single-assignment analysis. The technology thus developed was applied to IBM's automated logic synthesis work used in the manufacturing of semiconductor chips. It was also used in the VisualAge® C++ product and later in Eclipse®.
