From: Hans-Peter Diettrich <DrDiettrich1@aol.com>
Newsgroups: comp.compilers
Date: 5 Aug 2006 21:54:39 -0400
Organization: Compilers Central
References: 06-06-044 06-06-055 06-07-023 06-07-031 06-07-109 06-07-114 06-08-013 06-08-019
Keywords: parse, performance
Posted-Date: 05 Aug 2006 21:54:38 EDT
Gabriel Dos Reis wrote:
> | IMO it's neither the parser nor the lexer, instead it's the time spent
> | in preprocessing the source files.
>
> Having spent time on the matter, and having seen fellow contributors
> work on improving the situation, I have to disagree.
Do you take macro expansion into account, or only something like
precompiled headers?
> | I also don't know what impact namespaces have on the compilation of such
> | a source text.
>
> Basically, one has to do some minimal lookup to parse C++,
> i.e. distinguish between type-names, template-names, namespace-names,
> and other kinds of names. The elaborate name lookup rules (partly
> introduced by namespaces) compound the issue. From my measurements, a
> non-negligible amount of the time taken by the parser can be credited
> to name lookup.
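For illustration, a minimal sketch of the classic ambiguity such lookup
has to resolve (the identifiers here are hypothetical):

    a * b;    // if 'a' names a type: declares 'b' as a pointer to 'a'
              // if 'a' names a variable: evaluates the product a * b
    t<1>(x);  // if 't' names a template: instantiates t<1> and applies it to x
              // otherwise: parsed as the comparison (t < 1) > (x)

Without knowing what kind of entity 'a' and 't' denote, the parser cannot
even pick the right grammar production.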
My primary interest is the creation of the namespace tables. Are these
built only from explicitly #included files? IIRC the #using directive
doesn't allow for file names, so what's the source for the import of
the mentioned namespaces?
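To make the question concrete, a minimal sketch of what I mean by building
the namespace tables from explicitly #included files (file and names are
hypothetical):

    // lib.h
    namespace lib { void f(); int x; }   // declarations that end up in lib's table

    // main.cpp
    #include "lib.h"        // is this the only way lib's members get registered?
    using namespace lib;    // the using-directive itself names no file
    int main() { f(); }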
DoDi