Common SubExpression Elimination (CSEE) (1992-06-15)
From: email@example.com (Wim Dijkema (afst. Dijkstra))
Date: Mon, 15 Jun 1992 13:45:42 GMT
Part of my graduation project was to implement Gries' (Gries, 'Compiler
Construction for Digital Computers', 1971, pages 379-382, 406,407) CSEE
algorithm in the Front End of the Groningen Oberon Compiler, an Oberon-2
compiler. The Front End of this compiler is machine-independent. We chose
Gries' algorithm because all CSEs found by the Cocke & Schwartz algorithm
are also found by Gries' algorithm, while less memory space
is needed. This was the conclusion of a study done by R.J. Cazemier in
1989, here in Groningen. As a graduation project he discussed and proved
the CSEE methods of Davidson & Fraser, Cocke & Schwartz, Gries, and others.
The Oberon compiler is a retargetable compiler. The intermediate language
consists of register transfers. From some experiments we concluded that,
thanks to Gries' algorithm, one third of the register transfer
instructions could be eliminated.
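As an illustration only (this is not the Groningen compiler's actual code, and the tuple encoding and register names are invented for the sketch), local CSE over register transfers can be pictured as a value-numbering pass: each transfer computes an (op, src1, src2) triple, and a repeated triple is replaced by a copy from the register that already holds the value.

```python
# Local value numbering over register transfers: a repeated
# (op, src1, src2) triple is replaced by a copy of the register that
# already holds the value. Invalidation when a source register is
# overwritten is deliberately ignored to keep the sketch short.
def eliminate_cses(transfers):
    seen = {}                          # (op, src1, src2) -> result register
    out = []
    for dest, op, s1, s2 in transfers:
        key = (op, s1, s2)
        if key in seen:
            out.append((dest, 'copy', seen[key], None))
        else:
            seen[key] = dest
            out.append((dest, op, s1, s2))
    return out

code = [
    ('t1', '+', 'a', 'b'),
    ('t2', '+', 't1', 'c'),   # x = (a + b) + c
    ('t3', '+', 'a', 'b'),    # recomputes a + b: becomes a copy of t1
    ('t4', '+', 't3', 'd'),
]
opt = eliminate_cses(code)
```

A real pass would also propagate copies and invalidate entries whose source registers are overwritten; the sketch only shows the lookup-and-reuse idea.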
In his article, Michael Ernst discusses an example like:
> x = a + b + c + d
> y = b + c + d + e
The CSE b + c + d is not found by our compiler, but we think such cases
seldom occur. We did include tests for commutative operators, so the
algorithm sees that a + b and b + a are CSEs. However, when we omit
this test, the same number of CSEs is found.
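Both points can be illustrated with a small sketch (the tree encoding and helper names here are our own, not taken from any of the compilers discussed): with left-to-right grouping, x and y share no common subtree, so b + c + d is invisible to a subtree-based CSE pass, while sorting the operands of a commutative operator is enough to identify a + b with b + a.

```python
def subtrees(expr):
    # expr is either a variable name (str) or a tuple ('+', left, right)
    if isinstance(expr, str):
        return set()
    return {expr} | subtrees(expr[1]) | subtrees(expr[2])

x = ('+', ('+', ('+', 'a', 'b'), 'c'), 'd')   # x = a + b + c + d
y = ('+', ('+', ('+', 'b', 'c'), 'd'), 'e')   # y = b + c + d + e
print(subtrees(x) & subtrees(y))              # prints set(): no shared subtree

def normalize(expr):
    # canonical form for a commutative operator: sort the operand pair
    if isinstance(expr, str):
        return expr
    left, right = normalize(expr[1]), normalize(expr[2])
    return ('+',) + tuple(sorted((left, right), key=str))

print(normalize(('+', 'b', 'a')))             # prints ('+', 'a', 'b')
```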
However, Gries discusses Michael's problem too. According to Gries,
Breuer (Breuer, 'Generation of optimal code for expressions via
factorization', CACM 12 (June 1969), pages 333-340) describes a method
for factoring sequences of expressions and assignments. So the CSE b + c
+ d in Michael's example will be found using this method.
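A hedged sketch of the factoring idea only, not Breuer's actual algorithm: if each sum is treated as a multiset of its operands rather than as a fixed parse tree, the largest operand set shared by two sums can be pulled out as a common term, which recovers b + c + d.

```python
from collections import Counter

def common_factor(sum1, sum2):
    # operands shared by both sums, with multiplicity
    return sorted((Counter(sum1) & Counter(sum2)).elements())

x = ['a', 'b', 'c', 'd']   # x = a + b + c + d
y = ['b', 'c', 'd', 'e']   # y = b + c + d + e
t = common_factor(x, y)    # ['b', 'c', 'd'] -> compute once as t,
                           # then x = a + t and y = t + e
```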
I would like to know whether any compilers use the algorithm presented by
Gries. I am also curious about the number of instructions eliminated by
other CSEE algorithms, relative to the total number of instructions.