Re: Best, Simple versus Best

leichter@zodiac.rutgers.edu
Tue, 21 Mar 1995 21:09:08 GMT

          From comp.compilers

Related articles
[2 earlier articles]
Best, Simple versus Best preston@tera.com (1995-03-14)
Re: Best, Simple versus Best schrod@iti.informatik.th-darmstadt.de (1995-03-15)
Best, Simple versus Best Jon.Bertoni@Eng.Sun.COM (1995-03-15)
Re: Best, Simple versus Best hbaker@netcom.com (1995-03-16)
Re: Best, Simple versus Best oz@nexus.yorku.ca (1995-03-16)
Re: Best, Simple versus Best sdm7g@elvis.med.virginia.edu (Steven D. Majewski) (1995-03-20)
Re: Best, Simple versus Best leichter@zodiac.rutgers.edu (1995-03-21)
Re: Best, Simple versus Best csabhv@upe.ac.za (Prof Herman Venter) (1995-03-30)
Best, Simple versus Best preston@tera.com (1995-03-30)
Re: Best, Simple versus Best ryg@summit.novell.com (1995-04-15)
Re: Best, Simple versus Best ryg@summit.novell.com (1995-04-07)
Newsgroups: comp.compilers
From: leichter@zodiac.rutgers.edu
Keywords: optimize, design
Organization: Rutgers University Department of Computer Science
References: 95-03-050 95-03-081
Date: Tue, 21 Mar 1995 21:09:08 GMT

I don't entirely disagree with many of Preston Briggs's comments - for one
thing, I cited Powell's paper as someone else's arguments, not mine - but I
think Mr. Briggs overstates the case in his response.


First of all, he cites the classic Gabriel paper introducing the "worse is
better" theory. I think that's a fine paper with many excellent points -
but its applicability here is limited.


Gabriel's point about the "New Jersey" school is that it goes for simplicity
in the implementation, even if that imposes either complexity or errors on
the users of that implementation. For a compiler, that corresponds to
oddball special requirements to get good code, or incorrect compilation (often
as a result of optimizations that are known to be unsafe).
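
To make "unsafe" concrete, here is a hypothetical C example of my own, not
drawn from any compiler Gabriel or Powell discussed:

    #include <stdio.h>

    /* Sum the elements of a[] into *sum.  Nothing says sum cannot point
       into a[] itself. */
    void accumulate(int *a, int n, int *sum)
    {
        int i;
        for (i = 0; i < n; i++)
            *sum += a[i];
    }

    int main(void)
    {
        int a[4] = {1, 1, 1, 1};

        /* The aliasing case: the accumulator lives inside the array.  A
           compiler that "optimizes" by caching *sum in a register for
           the whole loop, without proving that sum and a[] never
           overlap, prints the wrong value here. */
        accumulate(a, 4, &a[2]);
        printf("a[2] = %d\n", a[2]);   /* correct answer: 7 */
        return 0;
    }

A compiler that kept *sum in a register across the whole loop - tempting,
unless it can prove sum never points into a[] - would silently get this case
wrong. That is the kind of error pushed onto the user that the New Jersey
approach tolerates.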


I see nothing in Powell's suggestions that implies either. His argument for
choosing simple optimization strategies is that he can get them right, not
primarily that it will let him finish his compiler faster. He also claims
that he has chosen strategies that work with "natural", well-written code,
rather than with odd-ball cases. He may be wrong, but that's very different
from deliberately pushing complexity onto the compiler's users - as many C
compilers of the same era did, requiring programmers to decide what to put in
registers, move invariant code out of loops, avoid subscripting in favor of
pointer hacking, and so on. Powell doesn't propose any of this, and in fact
much of it cannot even be specified in Modula-2.
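
To make the contrast concrete, here is a small sketch of the kind of
hand-tuning I mean. It is my own illustration, in C, not something taken from
Powell's paper or from any particular compiler; the function and its
parameters are made up.

    /* Straightforward version: relies on the compiler to allocate
       registers, hoist the loop-invariant product, and strength-reduce
       the subscript. */
    void scale_row(int *a, int n, int scale)
    {
        int i;
        for (i = 0; i < n; i++)
            a[i] = a[i] * n * scale;
    }

    /* The same function hand-tuned for a naive compiler: the programmer
       declares registers, hoists the invariant product, and walks a
       pointer instead of subscripting. */
    void scale_row_tuned(int *a, int n, int scale)
    {
        register int *p = a;
        register int *limit = a + n;
        register int k = n * scale;   /* invariant hoisted by hand */

        while (p < limit) {
            *p *= k;
            p++;
        }
    }

The second version is what many C compilers of that era effectively demanded
if you wanted decent code; Powell's position was that the compiler, not the
programmer, should do that work for the natural first version.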


> Everybody always likes the magic phrase "best, simple" (almost as much
> as the words "linear" and "optimal"). Why not say "I did the best I
> could" or "I built what I could decipher" versus trying to make a
> virtue out of doing a shoddy job.


Because that's not at all what Powell did. His compiler, at least as he
describes it - I have no personal experience with it either - used some
fairly sophisticated techniques. It just chose to avoid other, even more
complex techniques. Insulting words like "shoddy" don't belong in a rational
discussion.


> I never used Powell's compiler; it was dead by the time I started grad
> school. However, I know people who did use it. They described it as
> "a pig" -- used too much memory and too many cycles for "real" code (I
> use horror quotes since these were all academic users). They had to
> employ a "coke-can semaphore" to ensure that only 1 person was
> compiling at a time (i.e., you don't compile unless you've got _the_
> coke can in your possession).


It's impossible to know what to make of this without knowing (a) what machines
the compiler was being run on; (b) how large the systems it worked on were.
It is a fact that the *code* it produced was highly competitive with code
produced by much larger compilers using much more sophisticated techniques.


> Why? Simple algorithms (especially "simple to understand and
> implement" algorithms) aren't always asymptotically efficient.
> You have to work hard if you want to find efficient and effective
> approaches to hard problems.


Again, you're insulting Powell without, as far as I can tell, knowing what he
did or didn't do. I came away from the paper with the feeling that Powell
understood the complexity of his algorithms quite well.


It is also true that scalability for the sake of scalability is typical of
purely academic work. FORTRAN programs are often very large, and compilers
have to be able to deal with them. Individual Modula-2 compilation units
written by Modula-2 programmers who know how to use the language tend to
be quite small. Issues of scale arise, not in the compiler, but in the system
used to maintain the libraries. (A very similar issue arises today in C++
compilers. A C++ compiler with O(n^2) performance that became noticeable only
for functions longer than, say, 500 lines, but with a very fast method for
dealing with pre-compiled header files, would win big over a C++ compiler with
O(n log n) performance even on very large source files but with a slow method
for dealing with header files.)
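
To put rough numbers on that - invented numbers, chosen only to illustrate the
arithmetic, not measurements of any real compiler - here is a toy cost model
in C. Every constant in it is an assumption.

    #include <math.h>
    #include <stdio.h>

    /* Toy model: an n-line function costs either c2*n^2 or c1*n*log(n)
       "work units" to compile, and every compilation also pays a fixed
       header-processing cost.  All constants are invented for
       illustration.  (Link with -lm if your system needs it.) */
    int main(void)
    {
        double c2 = 1.0, c1 = 20.0;              /* assumed per-line costs */
        double hdr_fast = 50e3, hdr_slow = 5e6;  /* assumed header costs   */
        double n = 500.0;                        /* a "long" function      */

        double quad  = c2 * n * n      + hdr_fast;  /* O(n^2) body, fast headers     */
        double nlogn = c1 * n * log(n) + hdr_slow;  /* O(n log n) body, slow headers */

        printf("quadratic body + fast headers : %.0f units\n", quad);
        printf("n log n body + slow headers   : %.0f units\n", nlogn);
        return 0;
    }

Under these made-up constants the "worse" quadratic compiler finishes more
than an order of magnitude sooner, because the fixed header cost swamps the
per-function asymptotics at realistic function sizes.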


One of the questions I always ask my students is: Register allocation is
known to be NP-complete. What does that tell us about register allocation in
practical compilers? Answer: Essentially nothing.
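
If an illustration helps: practical allocators don't solve the NP-complete
problem exactly; they use heuristics that happen to work well on the
interference graphs real programs produce. Below is a deliberately tiny sketch
of a greedy graph-coloring pass, in C, over a made-up interference graph - an
illustration of the flavor of such heuristics, not any particular compiler's
allocator.

    #include <stdio.h>

    #define NVREG 6   /* virtual registers in the toy example   */
    #define K     3   /* physical registers available (assumed) */

    /* Interference graph as an adjacency matrix: interf[i][j] != 0 means
       virtual registers i and j are live at the same time and cannot
       share a physical register.  The graph is made up. */
    static const int interf[NVREG][NVREG] = {
        {0,1,1,0,0,0},
        {1,0,1,1,0,0},
        {1,1,0,1,1,0},
        {0,1,1,0,1,1},
        {0,0,1,1,0,1},
        {0,0,0,1,1,0},
    };

    int main(void)
    {
        int color[NVREG];   /* assigned physical register, or -1 = spill */
        int i, j, c;

        for (i = 0; i < NVREG; i++) {
            int used[K] = {0};

            /* Mark colors already taken by interfering neighbors that
               were colored on earlier iterations. */
            for (j = 0; j < i; j++)
                if (interf[i][j] && color[j] >= 0)
                    used[color[j]] = 1;

            /* Take the lowest free color; if none is free, spill.  This
               greedy pass is not optimal, but heuristics of this flavor
               do fine on the interference graphs real code produces. */
            color[i] = -1;
            for (c = 0; c < K; c++)
                if (!used[c]) { color[i] = c; break; }
        }

        for (i = 0; i < NVREG; i++)
            if (color[i] >= 0)
                printf("v%d -> r%d\n", i, color[i]);
            else
                printf("v%d -> spill\n", i);
        return 0;
    }

On this particular made-up graph the greedy pass colors every virtual register
with three physical registers and spills nothing; on real code, heuristics of
this kind come close enough to optimal that the worst-case theory rarely
matters.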


>>This lesson has to be re-learned periodically. I suppose the intellectual
>>heir to Powell's Modula-2 today is lcc - small, fast, simple, but still
>>manages to do a much better job than expected.
>
> I think lcc is the better effort. lcc aims at being fast and makes no
> claims about optimization. BTW, their speed wasn't simple to achieve;
> they worked hard for their speed, exploring many alternatives and
> employing the results of many years of implementation experience
> (theirs and others). Powell's compiler combines several bad features
> (big (data space, not code space), slow, ineffective) while making it
> easy for the compiler writer. That's fine for the compiler writer,
> but a little hard on the user.
>
> Of course, many optimizing compilers are big and slow, and many are
> ineffective. Bummer. But most people working on optimization are
> working to improve the state of the art, not hold it in the '70s.


The reason Powell's compiler produced fast code is that he apparently put some
hard work into understanding the machines he was generating code for.


Really, I don't know the man - do you, that you are so ready to judge whether
he worked hard?


It's a fact that most of the techniques developed in any given era will turn
out, in retrospect, not to be worth it. That doesn't mean one shouldn't try
to advance - but it does mean that one shouldn't be so quick to criticize
those who pick a point well behind the state of the art and use techniques
that have already proven themselves. Powell did his work in the early 80's,
and chose to use techniques that were then 5-10 years old. I call that a
reasonable engineering choice. If someone working today chose to use
exactly the same technologies as Powell did, my opinion would be rather
different. Machines have changed, languages have changed, environments have
changed - and we know more now than we did in 1975. It's easy to be clever in
retrospect.
-- Jerry

