Beginner's Question... mihai@A-aod.resnet.ucsb.edu (Mihai Christodorescu) (1997-01-16)
Re: Beginner's Question... email@example.com (1997-01-16)
Re: Beginner's Question... firstname.lastname@example.org (John Lilley) (1997-01-16)
Re: Beginner's Question... email@example.com (Kurt Svensson) (1997-01-17)
Re: Beginner's Question... firstname.lastname@example.org (William D Clinger) (1997-01-17)
Re: Beginner's Question... email@example.com (John Lilley) (1997-01-19)
From: Kurt Svensson <firstname.lastname@example.org>
Date: 17 Jan 1997 23:24:38 -0500
Organization: AlgoNet Public Access Node, Stockholm
>John Lilley wrote:
> It's more a matter of economics than theory. High-level languages are
> productivity tools for programmers.........
> john lilley
John (and others),
Are source code reading and parsing still the bottleneck, or is it
something else these days?
If, let us say, you divide compiler work into the following steps:
- Source code reading - lex
- Parsing (lex may be part of this step),
* tree build
* symbol table
* constants, etc.
- Tree optimizations
* Flow analysis
* ad hoc optimizations
- Code generation
* Register alloc
* Peephole opt.
* Code gen
What is the (average) percentage of work for each step?
- For a C compiler?
- For a C++ compiler?
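One crude way to get such numbers for your own compiler is to wrap each
phase in a timer and report the share of total compile time. A minimal
sketch (the phase names and the toy stand-in workloads below are
hypothetical, not measurements from any real C or C++ compiler):

```python
import time

def profile_phases(phases):
    """Run each (name, fn) phase once; return {name: percent of total time}."""
    elapsed = {}
    for name, fn in phases:
        start = time.perf_counter()
        fn()
        elapsed[name] = time.perf_counter() - start
    total = sum(elapsed.values()) or 1.0  # avoid division by zero
    return {name: 100.0 * t / total for name, t in elapsed.items()}

# Toy stand-ins for real compiler phases, for illustration only:
phases = [
    ("lex + parse", lambda: sum(range(200_000))),
    ("tree opt",    lambda: sum(range(100_000))),
    ("codegen",     lambda: sum(range(100_000))),
]
report = profile_phases(phases)
for name, pct in report.items():
    print(f"{name}: {pct:.1f}%")
```

Real compilers complicate this (phases interleave, and I/O and memory
allocation blur the boundaries), so per-phase wall-clock shares are only
a first approximation.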
Hasn't the importance of parser speed (excluding lexing) declined
lately? Hasn't building and reading data structures become more
important? Any articles or books on this?