Related articles
Beginner's Question... mihai@A-aod.resnet.ucsb.edu (Mihai Christodorescu) (1997-01-16)
Re: Beginner's Question... salomon@silver.cs.umanitoba.ca (1997-01-16)
Re: Beginner's Question... jlilley@empathy.com (John Lilley) (1997-01-16)
Re: Beginner's Question... edi-c@algonet.se (Kurt Svensson) (1997-01-17)
Re: Beginner's Question... will@ccs.neu.edu (William D Clinger) (1997-01-17)
Re: Beginner's Question... jlilley@empathy.com (John Lilley) (1997-01-19)
From: Kurt Svensson <edi-c@algonet.se>
Newsgroups: comp.compilers
Date: 17 Jan 1997 23:24:38 -0500
Organization: AlgoNet Public Access Node, Stockholm
Keywords: performance
John Lilley wrote:
> It's more a matter of economics than theory. High-level languages are
> productivity tools for programmers. [...]
> john lilley
John (and others),
Are source code reading and parsing still the bottleneck, or is it
something else?
If, let us say, you divide compiler work into the following steps:
- Source code reading - lex
- Parsing (lex may be part of this step),
* tree build
* symbol table
* constants etc..
- Tree optimizations
* Flow analysis
* Optimizations
* ad hoc optimizations
- Code generation
* Register alloc
* Peep hole opt.
* Code gen
- Linking
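To make the question concrete, the breakdown above can be measured by timing each phase separately. Below is a minimal, illustrative sketch (not a real compiler): each phase is a stand-in stub, and the names `lex`, `parse`, `optimize`, and `codegen` are hypothetical; the point is only the per-phase timing harness, which would apply the same way to real phases.

```python
import re
import time

def lex(src):
    # Source code reading + lexing: split input into tokens.
    return re.findall(r"\d+|[-+*/()]", src)

def parse(tokens):
    # Parsing stand-in: build a trivial "tree" from the token stream.
    return {"kind": "program", "tokens": list(tokens)}

def optimize(tree):
    # Stand-in for flow analysis / tree optimizations.
    tree["optimized"] = True
    return tree

def codegen(tree):
    # Stand-in for register allocation / code emission.
    return ";".join(tree["tokens"])

def profile(src):
    # Run the pipeline, timing each phase, and return the
    # percentage of total time spent in each.
    phases = [("lex", lex), ("parse", parse),
              ("optimize", optimize), ("codegen", codegen)]
    data, times = src, {}
    for name, fn in phases:
        t0 = time.perf_counter()
        data = fn(data)
        times[name] = time.perf_counter() - t0
    total = sum(times.values())
    return {name: t / total * 100 for name, t in times.items()}

if __name__ == "__main__":
    src = "1+2*(3-4)/5" * 10000
    for name, pct in profile(src).items():
        print(f"{name:8s} {pct:5.1f}%")
```

For real compilers the same idea is available off the shelf, e.g. GCC's `-ftime-report` flag prints wall time per pass.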
What is the (average) percentage of work for each step?
- For a C compiler?
- For a C++ compiler?
Hasn't the importance of parser speed (excluding lexing) declined
lately? Hasn't building and reading data structures become more
important? Any articles or books on this?
Regards
Kurt Svensson