RE: 90/10 rule... source? email@example.com (Quinn Tyler Jackson) (2004-01-09)
RE: 90/10 rule... source? firstname.lastname@example.org (Quinn Tyler Jackson) (2004-01-12)
Re: 90/10 rule... source? email@example.com (2004-01-16)
Re: 90/10 rule... source? firstname.lastname@example.org (2004-01-22)
From: Quinn Tyler Jackson <email@example.com>
Date: 12 Jan 2004 11:56:23 -0500
Posted-Date: 12 Jan 2004 11:56:23 EST
> I've wondered if there are any other* parser generators out there that can
> profile at the production level.
> * I say "other" because Meta-S does this.
> [I think I've seen a profiling version of yacc, but it was a long time
> ago. Unless a compiler does a great deal of analysis and
> optimization, the lexer is usually the part of the program that eats
> up the most time. I don't ever recall a parser that took much of the
> overall runtime. -John]
I added profiling to Meta-S because the adaptive(k) algorithm used to
match an $-grammar against input can make the grammar behave somewhat
like a "program", and certain productions can be optimized if one
knows they are the productions that take the most cycles to accept or
to fail to accept input. I haven't done any studies on whether the
90/10 rule applies to sophisticated grammars, though.
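[As an illustration only (this is not Meta-S, whose internals aren't
shown here), per-production profiling of the kind described above can
be sketched with a wrapper around the rules of a toy hand-written
recursive-descent parser; the grammar and rule names below are
invented for the example:

```python
import time
from collections import defaultdict

# Per-production accumulators: total seconds and attempt counts.
rule_time = defaultdict(float)
rule_calls = defaultdict(int)

def profiled(rule):
    """Time every attempt at a production, whether it accepts or fails."""
    def wrapper(*args):
        start = time.perf_counter()
        try:
            return rule(*args)
        finally:
            rule_time[rule.__name__] += time.perf_counter() - start
            rule_calls[rule.__name__] += 1
    wrapper.__name__ = rule.__name__
    return wrapper

# Toy grammar:  expr -> term ('+' term)* ;  term -> DIGIT
@profiled
def term(s, i):
    if i < len(s) and s[i].isdigit():
        return i + 1          # accept: position after the digit
    return None               # fail

@profiled
def expr(s, i):
    i = term(s, i)
    if i is None:
        return None
    while i < len(s) and s[i] == '+':
        j = term(s, i + 1)
        if j is None:
            return None
        i = j
    return i

end = expr("1+2+3", 0)
for name in sorted(rule_calls):
    print(f"{name}: calls={rule_calls[name]}, time={rule_time[name]:.6f}s")
```

The same idea scales to a generated parser by having the generator
emit the bookkeeping at each rule's entry and exit points rather than
using a wrapper. -ed.]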
It might be interesting to run the C++, Perl, and C# grammars (that
is, the "larger" grammars) against test input and determine whether
those grammars tend to spend most of their time in a small fraction
of their rules.
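[Given per-production timings from such a run, checking whether a
90/10-style skew holds reduces to finding the smallest set of rules
that covers 90% of the total time. A minimal sketch, using invented
rule names and hypothetical timings in milliseconds:

```python
def rules_for_fraction(profile, fraction=0.90):
    """Smallest set of rules (taken by time, descending) whose
    combined time reaches `fraction` of the total."""
    total = sum(profile.values())
    acc, chosen = 0, []
    for name, t in sorted(profile.items(), key=lambda kv: kv[1], reverse=True):
        chosen.append(name)
        acc += t
        if acc >= fraction * total:
            break
    return chosen

# Hypothetical per-production timings (ms) from a profiling run:
profile = {"expr": 20, "stmt": 50, "ident": 600, "ws": 300, "type": 30}
hot = rules_for_fraction(profile)
print(f"{len(hot)}/{len(profile)} rules cover 90% of the time: {hot}")
```

If `len(hot)` stays near 10% of the rule count across the larger
grammars, the 90/10 rule would seem to carry over. -ed.]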
Quinn Tyler Jackson