Best, Simple versus Best

preston@tera.com (Preston Briggs)
Tue, 14 Mar 1995 02:14:32 GMT


Newsgroups: comp.compilers
From: preston@tera.com (Preston Briggs)
Keywords: optimize, design
Organization: Compilers Central
References: 95-03-050
Date: Tue, 14 Mar 1995 02:14:32 GMT

>For a classic look at the opposite approach, you might want to [look at]


    title="A Portable Optimizing Compiler for {Modula-2}",
    author="Michael L. Powell",
    pages="310--318",
    journal=sigplan,
    year=1984,
    month=jun,
    volume=19,
    number=6,
    note=pldi84


>Powell's approach to the optimizer was what he called "best simple": Do the
>best job you can while keeping the optimizer itself simple and easy to under-
>stand (and thus get right). He rejects many of the published algorithms as
>too complex, too hard to understand, and too unlikely to be useful in most
>programs. The basic design used in Powell's compiler he credits to
>papers dating back to 1970.


Everybody always likes the magic phrase "best, simple" (almost as much
as the words "linear" and "optimal"). Why not say "I did the best I
could" or "I built what I could decipher" instead of trying to make a
virtue out of doing a shoddy job?


I never used Powell's compiler; it was dead by the time I started grad
school. However, I know people who did use it. They described it as
"a pig" -- it used too much memory and too many cycles for "real" code
(I use horror quotes since these were all academic users). They had to
employ a "coke-can semaphore" to ensure that only one person was
compiling at a time (i.e., you don't compile unless you've got _the_
coke can in your possession).


Why? Simple algorithms (especially "simple to understand and
implement" algorithms) aren't always asymptotically efficient.
You have to work hard if you want to find efficient and effective
approaches to hard problems.
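
To make that concrete, here's a toy illustration (mine, not anything
from Powell's compiler): scanning a basic block for textually repeated
expressions. The "simple to understand" version compares every pair
of instructions, which is O(n^2); hashing each (opcode, operand,
operand) triple makes the same scan expected O(n). Nobody notices on
a ten-instruction block, but on the big machine-generated blocks real
code contains, the quadratic version is the pig.

#include <stdio.h>
#include <string.h>

#define N_INSTS   6     /* demo-sized; a real block table would grow */
#define N_BUCKETS 64

typedef struct { char op; int a, b; } Inst;   /* t := a OP b */

/* Simple version: compare each instruction against every earlier
   one.  Trivial to understand, but O(n^2) in the block size. */
static int redundant_naive(const Inst *code, int n)
{
    int found = 0;
    for (int i = 0; i < n; i++)
        for (int j = 0; j < i; j++)
            if (code[i].op == code[j].op &&
                code[i].a == code[j].a && code[i].b == code[j].b) {
                found++;
                break;
            }
    return found;
}

/* Hashed version: one bucket probe per instruction, expected O(n).
   More machinery, much better asymptotic behavior. */
static int redundant_hashed(const Inst *code, int n)
{
    int head[N_BUCKETS], next[N_INSTS], found = 0;
    memset(head, -1, sizeof head);            /* empty buckets */
    for (int i = 0; i < n; i++) {
        unsigned h = (((unsigned)code[i].op * 31u + (unsigned)code[i].a)
                      * 31u + (unsigned)code[i].b) % N_BUCKETS;
        for (int j = head[h]; j >= 0; j = next[j])
            if (code[j].op == code[i].op &&
                code[j].a == code[i].a && code[j].b == code[i].b) {
                found++;
                break;
            }
        next[i] = head[h];                    /* chain into bucket */
        head[h] = i;
    }
    return found;
}

int main(void)
{
    Inst code[N_INSTS] = {
        {'+', 1, 2}, {'*', 3, 4}, {'+', 1, 2},
        {'-', 5, 6}, {'*', 3, 4}, {'+', 1, 2},
    };
    printf("naive:  %d\n", redundant_naive(code, N_INSTS));   /* 3 */
    printf("hashed: %d\n", redundant_hashed(code, N_INSTS));  /* 3 */
    return 0;
}

The point isn't that hashing is hard; it's that somebody had to think
past the obvious double loop, and that kind of thinking is exactly
what "best simple" declines to do.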


For example, the community isn't interested in SSA because they like
to type "$\phi$-node" all over their papers; they're interested
because SSA form provides an efficient (sparse, factored)
representation of several important relationships. Sure, you can do
some of the same things with bit vectors (and where you can, it's
simpler to build), but they're big and slow, especially for large
programs.
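
If you haven't run into SSA, here's the flavor (a made-up fragment of
mine, not from any particular paper): every assignment gets a fresh
name, and a phi-node at each join point selects whichever definition
actually arrived.

/* A source fragment with two definitions of x reaching one use,
   and (in the comments) its SSA form. */
int select_x(int p)
{
    int x;
    if (p)              /*   if (p)                  */
        x = 1;          /*       x1 := 1             */
    else                /*   else                    */
        x = 2;          /*       x2 := 2             */
    return x + 1;       /*   x3 := phi(x1, x2)       */
                        /*   return x3 + 1           */
}

A use of x3 points at exactly one definition, and the phi-node at
exactly two; a bit-vector formulation instead drags a bit for every
definition in the procedure through every basic block, whether or not
x is even live there.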


>This lesson has to be re-learned periodically. I suppose the intellectual
>heir to Powell's Modula-2 today is lcc - small, fast, simple, but still
>manages to do a much better job than expected.


I think lcc is the better effort. lcc aims at being fast and makes no
claims about optimization. BTW, their speed wasn't simple to achieve;
they worked hard for it, exploring many alternatives and employing the
results of many years of implementation experience (theirs and
others'). Powell's compiler combines several bad features (big in
data space, though not code space; slow; ineffective) while making
life easy for the compiler writer. That's fine for the compiler
writer, but a little hard on the user.


Of course, many optimizing compilers are big and slow, and many are
ineffective. Bummer. But most people working on optimization are
working to improve the state of the art, not hold it back in the '70s.


Preston Briggs

