|Impact of large yacc grammar on optimizations firstname.lastname@example.org (1996-06-26)|
|Re: Impact of large yacc grammar on optimizations email@example.com (1996-06-30)|
|From:||firstname.lastname@example.org (Colm McHugh)|
|Date:||26 Jun 1996 11:42:54 -0400|
|Keywords:||yacc, C, optimize, question|
I have a question about what C compiler optimizations get lost when
compiling an enormous piece of code.
For example, I have a large yacc grammar with a lot of actions, and
yacc produces a yyparse function that is 10626 lines, with a switch
statement that is 10204 lines long, and has 986 cases.
I have heard tell that certain optimizations will not be done by the
C compiler when dealing with such a large piece of code. Obviously, it
will differ from compiler to compiler, but I would like to get an idea
of what kind of optimizations go out the window.
I'm considering switching to PCCTS, which generates a function for each
rule in the grammar. I think that PCCTS is a much better tool to work
with than yacc, but for small grammars (~30-40 productions) yacc generates
faster parsers. However, as more productions and code are added to a yacc
grammar, there may be a performance degradation in the generated parser,
due to the large amount of code produced. I don't believe PCCTS would
suffer from this problem, because it produces a separate function for
each rule.
Anyway, just to recap: I would like to get an idea of which optimizations
may be lost, or how performance is otherwise affected, for a large yacc
grammar that produces code with features like those mentioned above.
Help/advice/suggestions very much appreciated,