Analysis and optimization frameworks related work? (Jeff Dean)
12 Oct 1996 22:15:41 -0400


From: (Jeff Dean)
Newsgroups: comp.compilers
Date: 12 Oct 1996 22:15:41 -0400
Organization: DEC Western Research Lab
Keywords: dataflow, question

This is a request for information about related work on frameworks
that make it easier to design and implement compiler optimizations.
We're currently in the process of writing up some work we've done in
this area, and we're looking for related work on frameworks that make
it easy to specify and implement compiler analyses. We know about
some intraprocedural frameworks, such as those described in Tjiang
et al.'s PLDI'91 paper and in Adl-Tabatabai et al.'s OOPSLA'96 paper,
and we also know about some interprocedural frameworks, such as Mary
Hall's FIAT framework. However, since dataflow analyses are the
underpinning of optimizing compilers, we suspect that many existing
compilers have similar frameworks. It's possible that the details of
these frameworks have not been published, or have been published in
off-the-beaten-track places or in internal TRs. To be sure we
adequately address the related work, we'd very much appreciate hearing
about any works that you think are relevant.

To give a bit more context about our work, we've developed a new
framework to make it easy to express intra- and inter-procedural
dataflow analyses and optimizations. Essentially, we have a dataflow
analysis & transformation engine that manages the details of merging
together information, iterating when fixed point has not yet been
reached, and applying transformations that are requested by the client
analysis. By arranging for the analysis framework alone to make
transformations to the underlying intermediate representation, we
obtain two nice properties:

    o A client of the framework can perform analyses and transformations
at the same time. Traditional approaches to performing optimizations
usually involve one pass over the IR to compute some properties, and
then a second pass over the graph to perform transformations based on
this information. Our approach permits clients to optimistically
request transformations that they would like to perform before they
are certain that the information is conservative (because a fixed
point may not yet have been reached); the dataflow framework takes
care of applying these transformations only at the appropriate time.

    o Multiple analyses & transformations can easily be combined to
run in parallel when there are synergistic effects from doing so
(Click & Cooper have developed a theoretical framework that specifies
when combining multiple optimizations together is desirable; our
framework makes such combinations easy to do in practice).
I'd also appreciate any philosophical comments or anecdotes that
people might have about such frameworks.

    -- Jeff

Jeffrey Dean, Member of Research Staff
Western Research Laboratory, Digital Equipment Corporation
