Re: Aggressive optimization

Ira Baxter <baxter@zola.ICS.UCI.EDU>
23 Oct 90 19:01:06 GMT

          From comp.compilers

Related articles
Re: Aggressive optimization brnstnd@kramden.acf.nyu.edu (1990-10-18)
Re: Aggressive optimization eerke@cs.kun.nl (1990-10-19)
Re: Aggressive optimization burley@world.std.com (1990-10-20)
Re: Aggressive optimization baxter@zola.ICS.UCI.EDU (Ira Baxter) (1990-10-23)
Re: Aggressive optimization daveg@near.cs.caltech.edu (1990-10-23)
Newsgroups: comp.compilers
Keywords: optimize, interactive, visions
Organization: Compilers Central

There has been considerable discussion of optimization using "symbolic
algebra" techniques:


>In article <1990Oct18.223231.22315@esegue.segue.boston.ma.us> brnstnd@kramden.acf.nyu.edu (Dan Bernstein) writes:


> Does anyone have a compiler that can introduce a non-linear intermediate
> expression and reduce around it? If so, I'm impressed. How advanced is
> the symbolic algebra system included in the compiler? Can it take
> advantage of range constraints, so that if it knows that x is between 0
> and 7, it might set up a table to calculate ((1<<(x+2))-1)/10 quickly?
> Can it manipulate floor and ceiling expressions? Can it introduce
> invariants to figure out range constraints? These are all part of that
> single, fundamental optimization technique.


> I know, compilers are improving. Some optimizers fully automate the
> dullest, most machine-dependent part of optimization---viz., figuring
> out how often loops or branches are executed in practice, seeing how
> long machine instructions take, and deciding on that basis whether a
> given optimization is worthwhile. I really appreciate that. I won't stop
> hand optimization because of it.


An approach to "optimizing" is that of transformation systems: use a tool
based solely on "symbolic algebra", usually packaged as a set of
correctness-preserving transformations (cpts), to modify the source or to
refine it to more machine-like representations. This process is intended to
be semi-automatic; a very good overview article can be found in
Balzer85a:TI-perspective. The semi- part comes about because people will
always know more about their problem than any fixed set of cpts will, and so
interaction is necessary to tell the system that certain optimizations are
legal for this program.


Recent transformation systems are in fact founded on the notion that the
software engineer, as part of his duties, should formally define such
algebras as a basis for the transformations used. Check out the CIP
(Bauer87a:CIP-volume) and PROSPECTRA (Krieg-Bruckner88a:PROSPECTRA-overview)
systems. In these systems, the SE can actually compose algebraic facts to
make transformations specific to his purposes.


Hand-optimizations are nice, but mean that the original abstract program gets
lost (come, now, you didn't *really* keep the original source did you?) and
later modifications become that much more difficult. The transformational
approach to this is to generate some sequence of transformations, some
automatically, some by hand, and to keep the sequence for later replay when
the specification is modified. This is currently a hot research topic.


Playing the perfect shill, in <BURLEY.90Oct20081629@world.std.com>
burley@world.std.com (James C Burley) writes:
>[...visions about an interactive compiler...]
>Further, this system would keep information on the program in a persistent
>fashion so the information would not have to be entered again and again. To
>do that, it would be best for the system to serve as the "source editor", and
>be capable of recognizing when its current categorizations of a given chunk
>of code (at whatever level of granularity) were no longer applicable due to
>changes in the code, and forwarding information on changes to other chunks of
>code that held a "vested interest" in the changed code.


>I think this approach may be one of the more fundamental changes in
>optimization technology over the next decade or so: not because it introduces
>any sexy new optimizations per se, but because it permits programmers to
>easily pick apart their own creations and put them back together, without
>having to rewrite the source code if desired. Such a system might allow the
>industry to move back to powerful languages that "hide" potent but often
>expensive operations in innocuous-looking constructs (a problem with
>languages like PL/I that C lovers like to point out, and which is valid on a
>practical level), since the expense would be easily discoverable during the
>"optimization discovery" process offered by the proposed system.


With that modest :-} :-} introduction, I introduce my own work, which
effectively does just this; see Baxter90a. In particular, it takes
specification deltas (source changes), and determines how those changes affect
the set of optimizations already applied, retaining those that it can,
dropping those that conflict.


Now, having raised your hopes enormously, I must also dash them a bit. This
is research; I have pursued the ideas and demonstrated that they work on
*very* small examples. It will take several years to seriously scale up.


@phdthesis(Baxter90a:thesis-transformational-maintenance,
      title = "Transformational Maintenance by Reuse of Design Histories",
      author = "Ira D. Baxter",
      department = "Information and Computer Science Department",
      school = "University of California at Irvine",
      month = nov, <<<< note this
      year = 1990,
)


@article(Balzer85a:TI-perspective,
      title = "{A 15 Year Perspective on Automatic Programming}",
      author = "Robert Balzer",
      journal = "IEEE Transactions on Software Engineering",
      volume = "SE-11",
      number = "11",
      month = nov,
      year = 1985,
      pages = "1257-1268",


@book(Bauer87a:CIP-volume,
      author = "F. L. Bauer and H. Ehler and A. Horsch and B. Moller
                          and H. Partsch and O. Paukner and P. Pepper",
      title = "{The Munich Project CIP}",
      publisher = "Springer-Verlag",
      year = 1987,
      note = "Lecture Notes in Computer Science 292",
)


@inproceedings(Krieg-Bruckner88a:PROSPECTRA-overview,
    author = "Bernd Krieg-Bruckner",
    title = "{The {PROSPECTRA} Methodology of Program Development}",
    booktitle = "{IFIP/IFAC Working Conference on Hardware and Software
for Real Time Process Control}",
    publisher = "North-Holland, New York",
    address = "Warsaw, Poland",
    pages = "1-15",
    year = 1988,
)
--
Ira Baxter
--

