Re: Analysis and optimization frameworks related work?
18 Oct 1996 08:43:36 -0400

          From comp.compilers


Newsgroups: comp.compilers
Date: 18 Oct 1996 08:43:36 -0400
Organization: Universitaet Karlsruhe
References: 96-10-044
Keywords: tools, optimize, bibliography

(Jeff Dean) writes:
|> This is a request for information about related work on frameworks
|> that make it easier to design and implement compiler optimizations.

One can distinguish between compiler construction frameworks that
support practical work, generators of analysis algorithms, and
theoretical analysis frameworks.

1) Compiler frameworks:

- CoSy:
    There are many internal reports on the principles of CoSy; ask ACE.
- Sage++ (Indiana University, IRISA):
    C++-oriented framework for instruction scheduling and other
    optimizations

2) Generators of algorithms:

- PAG (Martin Alt, University Saarbruecken):
    interprocedural analysis and abstract interpretation
- Optimix (myself):
    intraprocedural analysis and transformation.
    PAG and Optimix work standalone, but also with the CoSy compiler
    framework.
- Sharlit (Tjiang, SUIF):
    handles only bitvector problems (?)
- MetaFrame (Steffen, Knoop, University Passau) based on modal logic
- SPARE (Venkatesh) based on abstract interpretation

3) Theory frameworks:

- Cousot's abstract interpretation framework (especially Francois
    Bourdoncle's work on Syntox, the abstract debugger)
- Jens Knoop's thesis on the 'interprocedural coincidence theorem',
    which states conditions under which intraprocedural analysis
    algorithms can be used for interprocedural problems

|> o A client of the framework can perform analyses and transformations
|> at the same time. Traditional approaches to performing optimizations
|> usually involve one pass over the IR to compute some properties, and
|> then a second pass over the graph to perform transformations based on
|> this information. Our approach permits clients to optimistically
|> request transformations that they would like to perform before they
|> are certain that the information is conservative (because fixed-point
|> has not yet necessarily been reached): the dataflow framework takes
|> care of only applying these transformations at the appropriate time.
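The idea can be sketched roughly as follows. This is my own small
illustration of "optimistic transformation requests", not Jeff's actual
framework; all names here are invented. A client registers a rewrite
together with the dataflow fact it depends on; the framework commits
only those rewrites whose facts survived to the fixed point:

```python
# Sketch (hypothetical names): a dataflow framework that defers
# client-requested transformations until the fixed point is reached.

class OptimisticFramework:
    def __init__(self):
        self.facts = set()    # current (possibly optimistic) dataflow facts
        self.pending = []     # (required_fact, rewrite_fn) pairs

    def assert_fact(self, fact):
        self.facts.add(fact)

    def retract_fact(self, fact):
        # A later iteration may invalidate an optimistic fact.
        self.facts.discard(fact)

    def request_transform(self, required_fact, rewrite):
        # Clients may call this *before* the fixed point is reached.
        self.pending.append((required_fact, rewrite))

    def commit(self, program):
        # Called once iteration has stabilized: only rewrites whose
        # required fact still holds are actually performed.
        for fact, rewrite in self.pending:
            if fact in self.facts:
                program = rewrite(program)
        return program

fw = OptimisticFramework()
fw.assert_fact(("const", "x", 3))
fw.assert_fact(("const", "y", 5))
# Optimistically fold the use of x in "y := x" to the constant 3.
fw.request_transform(("const", "x", 3),
                     lambda p: [("y", 3) if i == ("y", "x") else i for i in p])
# This second request will be dropped, because its fact is retracted below.
fw.request_transform(("const", "y", 5), lambda p: p)
fw.retract_fact(("const", "y", 5))   # invalidated before the fixed point

program = [("y", "x")]               # y := x
print(fw.commit(program))            # only the x-rewrite fires: [('y', 3)]
```

The point of the sketch is the separation: `request_transform` records
intent during iteration, and `commit` filters by what is actually
conservative at the end.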

Have you compared this to transformational attribute grammars, i.e.
higher-order attribute grammars? AGs also compute the time and place
of transformations, although they are restricted to trees. There is
work by D. Swierstra, M. Jourdan, and J. Grosch on that. Please look
in the bib-servers.
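To make the comparison concrete, here is a toy sketch (my own, not
from any of the cited systems) of the higher-order AG flavour: a
synthesized attribute carries a rewritten subtree upward, so analysis
and transformation happen in one evaluation pass over the tree:

```python
# Toy higher-order-AG-style constant folding on expression trees.
# A node is an int literal, a variable name (str), or ("+", left, right).

def fold(node):
    if not isinstance(node, tuple):
        return node                   # leaves fold to themselves
    op, l, r = node
    fl, fr = fold(l), fold(r)         # synthesized attributes of the children
    if isinstance(fl, int) and isinstance(fr, int):
        return fl + fr                # the "place" of the transformation
    return (op, fl, fr)               # rebuild with folded children

tree = ("+", ("+", 1, 2), ("+", 4, "x"))
print(fold(tree))                     # → ('+', 3, ('+', 4, 'x'))
```

The synthesized attribute here is simply the replacement tree itself,
which is exactly the higher-order ingredient.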

@InProceedings{assmann.94b,
    author = {Alt, M. and A{\ss}mann, U. and van~Someren, H.},
    title = {{CoSy} Compiler Phase Embedding with the {CoSy} Compiler Model},
    booktitle = {Compiler Construction (CC)},
    series = {Lecture Notes in Computer Science},
    volume = {786},
    editor = {Fritzson, P. A.},
    pages = {278--293},
    publisher = {Springer},
    address = {Heidelberg},
    month = apr,
    year = {1994}
}
@TechReport{assmann.96c,
    author = {A{\ss}mann, Uwe},
    title = {{OPTIMIX} Language Manual (for {OPTIMIX} 2.0)},
    institution = {INRIA Rocquencourt},
    number = {RT-0195},
    year = {1996},
    abstract = {This is the language manual for OPTIMIX, the optimizer
                generator. OPTIMIX can be used to generate program analyses
                and transformations. Its input language is based on DATALOG
                and graph rewriting. In particular, two new classes of graph
                rewrite systems are used: edge addition rewrite systems
                (EARS) and exhaustive graph rewrite systems (XGRS).}
}
@InProceedings{alt.95a,
    author = {Alt, M. and Martin, F.},
    title = {Generation of Efficient Interprocedural Analyzers with {PAG}},
    booktitle = {Static Analysis Symposium (SAS)},
    editor = {Mycroft, A.},
    series = {Lecture Notes in Computer Science},
    publisher = {Springer},
    address = {Heidelberg},
    year = {1995},
    note = {to appear}
}

@Article{bourdoncle.93b,
    author = {Bourdoncle, Fran\c{c}ois},
    title = {Abstract Debugging of Higher-Order Imperative Languages},
    journal = {ACM SIGPLAN Notices},
    volume = {28},
    number = {6},
    pages = {46--55},
    month = jun,
    year = {1993},
    note = {Proceedings of PLDI~'93}
}
@InProceedings{bourdoncle.93a,
    author = {Bourdoncle, F.},
    title = {Efficient Chaotic Iteration Strategies with Widenings},
    booktitle = {Proceedings of the International Conference on Formal
                 Methods in Programming and their Applications},
    series = {Lecture Notes in Computer Science},
    volume = {735},
    pages = {128--141},
    publisher = {Springer},
    year = {1993}
}

@InProceedings{knoop.92b,
    author = {Knoop, J. and Steffen, B.},
    title = {The Interprocedural Coincidence Theorem},
    booktitle = {Compiler Construction (CC)},
    editor = {U. Kastens and P. Pfahler},
    series = {Lecture Notes in Computer Science},
    volume = {641},
    pages = {125--140},
    publisher = {Springer},
    address = {Heidelberg},
    month = oct,
    year = {1992}
}
@PhDThesis{knoop.93a,
    author = {Knoop, J.},
    title = {Optimal Interprocedural Program Optimization: {A} New
             Framework and its Application},
    school = {Universit\"at Kiel},
    year = {1993}
}

Best regards

Dr. Uwe Assmann * University of Karlsruhe, IPD
Vincenz-Priessnitz-Str. 3 * 76128 Karlsruhe, Germany
tel: +49/721/608-4763 * email:
