Re: Beginner's Question...


Related articles
Beginner's Question... mihai@A-aod.resnet.ucsb.edu (Mihai Christodorescu) (1997-01-16)
Re: Beginner's Question... salomon@silver.cs.umanitoba.ca (1997-01-16)
Re: Beginner's Question... jlilley@empathy.com (John Lilley) (1997-01-16)
Re: Beginner's Question... edi-c@algonet.se (Kurt Svensson) (1997-01-17)
Re: Beginner's Question... will@ccs.neu.edu (William D Clinger) (1997-01-17)
Re: Beginner's Question... jlilley@empathy.com (John Lilley) (1997-01-19)

From: John Lilley <jlilley@empathy.com>
Newsgroups: comp.compilers
Date: 19 Jan 1997 21:49:45 -0500
Organization: Nerds for Hire, Inc.
References: 97-01-141
Keywords: practice, performance

Kurt Svensson wrote:
> Is it still source code reading and parsing the bottleneck or what?


I think that today in C++ the bottleneck is template expansion, at
least on the compiler that I use (MSVC++), because it uses the
"inclusion" model of templates. If and when someone implements the
"export" keyword for templates, something other than templates may
emerge as the culprit.


Templates aside, the work of lexing, parsing, and analyzing the
zillions of lines of header files included by each source file is
clearly the big hit. This doesn't break down neatly per task, but I
call it out separately because precompiled headers can, to some
degree, mitigate all of the tasks associated with header-file
processing.


> Has not the importance of the parser (exclusive lex) speed declined lately?


Yes, I think the raw recognition of tokens and syntactic patterns has
become less important, except where the ratio of non-code-producing
header-file inclusion to code-producing source-file size is large (as
it often is). In a language like C++, a large amount of time is spent
managing the symbol table and the scope hierarchy, and the syntactic
phase gets muddled with semantic analysis.


The conclusion that raw lexing/syntax is unimportant is supported by
the observation that compiling 100k lines of C is about five times
faster than compiling 100k lines of C++, so the extra time is
certainly not spent on raw lexing and syntactic recognition.


john
--

