Related articles:
Re: Writing fast compilers... pcg@aber.ac.uk (1991-08-11)
Re: Writing fast compilers... preston@helena.rice.edu (1991-08-11)
Re: Writing fast compilers... pardo@gar.cs.washington.edu (1991-08-13)
Re: Writing fast compilers... davidsen@crdos1.crd.ge.com (1991-08-13)
Re: Writing fast compilers... preston@helena.rice.edu (1991-08-13)
Re: Writing fast compilers... alex@vmars.tuwien.ac.at (1991-08-13)
Re: Writing fast compilers... pcg@aber.ac.uk (1991-08-14)
Re: Writing fast compilers... markh@csd4.csd.uwm.edu (1991-08-16)
[5 later articles]
Newsgroups: comp.compilers
From: preston@helena.rice.edu (Preston Briggs)
Keywords: performance
Organization: Rice University, Houston
References: <1991Aug10.132405.19359@walter.bellcore.com> <20167@helios.TAMU.EDU> 91-08-041
Date: 11 Aug 91 21:13:39 GMT
pcg@aber.ac.uk (Piercarlo Grandi) writes:
>Many "optimizing" compilers have impossibly expensive optimization
>phases; for example the SPARC and MIPS compilers have a global
>optimization algorithm with a time complexity of O(n^3), and the GNU CC
>compiler has a space complexity of O(n^2). These are well past any
>reasonable cost/benefit ratio... Clearly anything well done compares
>well against these monsters.
I just love "optimizing" compilers. Why not drop the horror-quotes (and
_optimizing_, for that matter) and just say _compilers_? If you want to talk
about compilers without optimization, say _dumb_ perhaps, or _naive_, or
_experimental_.
The complexities you mention aren't really unusual. Just doing global data
flow analysis (no optimization, just collecting information) takes at least
O(n) bit-vector steps, where n is the number of basic blocks. Since the length
of the bit vectors and the number of basic blocks are both probably
proportional to the procedure length, we're already up to O(n^2) space and time.
Of course, bit vectors are pretty dense and working over basic blocks is
pretty quick, so most compilers are able to accomplish data flow analysis very
quickly, notwithstanding the time complexity.
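To make the bit-vector arithmetic concrete, here is a small sketch of
iterative live-variable analysis; it is my own illustration, not code from
any of the compilers mentioned, and the inputs (blocks, succ, use, defs) are
hypothetical. Each pass visits every block and does bit-vector work
proportional to the number of variables, which is where the quadratic
behavior comes from; in practice the loop converges in a few passes.

    # A minimal sketch of iterative live-variable analysis.
    # Python ints stand in for bit vectors over variable indices;
    # 'blocks', 'succ', 'use', and 'defs' are made-up inputs.
    def live_variables(blocks, succ, use, defs):
        live_in = {b: 0 for b in blocks}
        live_out = {b: 0 for b in blocks}
        changed = True
        while changed:                        # iterate to a fixed point
            changed = False
            for b in blocks:
                out = 0
                for s in succ[b]:             # union of successors' LIVE-IN
                    out |= live_in[s]
                inn = use[b] | (out & ~defs[b])
                if out != live_out[b] or inn != live_in[b]:
                    live_out[b], live_in[b] = out, inn
                    changed = True
        return live_in, live_out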
It's perhaps harder to discuss the cost/benefit ratio. I will note that I
spend much more time writing and using programs than compiling them.
>If you want to read something really interesting on the "small is
>beautiful" technology front, do a literature search for TWIG, the
>Princeton code generator tool, and its successors.
^^^^^^^^^ Bell Labs
Look in
Code Generation Using Tree Matching and Dynamic Programming
Aho, Ganapathi, and Tjiang
TOPLAS, October 1989
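For a feel of the tree-matching and dynamic-programming idea the paper
describes, here is a toy sketch; it is entirely my own illustration, not
TWIG's actual algorithm or rule language. Each node of an expression tree
records the cheapest way to derive a few invented "nonterminals" (reg, imm),
given the optimal costs already computed for its children.

    # Toy bottom-up dynamic-programming instruction selection.
    # The node shapes, nonterminals, rules, and costs are invented
    # for illustration only.
    class Node:
        def __init__(self, op, *kids):
            self.op, self.kids = op, kids

    INF = float('inf')

    def label(node):
        """Return {nonterminal: (cost, rule)} for 'node', bottom-up."""
        kids = [label(k) for k in node.kids]
        best = {}

        def record(nt, cost, rule):
            if cost < best.get(nt, (INF, None))[0]:
                best[nt] = (cost, rule)

        if node.op == 'CONST':
            record('imm', 0, 'const -> imm')
        if node.op == 'ADD' and len(kids) == 2:
            l, r = kids
            if 'reg' in l and 'reg' in r:
                record('reg', l['reg'][0] + r['reg'][0] + 1, 'add reg,reg')
            if 'reg' in l and 'imm' in r:
                record('reg', l['reg'][0] + r['imm'][0] + 1, 'add reg,imm')
        if 'imm' in best:                     # chain rule: load an immediate
            record('reg', best['imm'][0] + 1, 'load imm')
        return best

    # e.g. label(Node('ADD', Node('CONST'), Node('CONST')))['reg']
    # picks the cheaper 'add reg,imm' cover, at total cost 2.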
Preston Briggs