Looking for the GNU gcc grammar email@example.com (2003-04-13)
Re: Looking for the GNU gcc grammar firstname.lastname@example.org (2003-04-15)
Re: Looking for the GNU gcc grammar email@example.com (Daniel Berlin) (2003-04-15)
Re: Looking for the GNU gcc grammar firstname.lastname@example.org (Ross Bencina) (2003-04-20)
Re: Looking for the GNU gcc grammar email@example.com (2003-04-27)
From: firstname.lastname@example.org (Michael Tiomkin)
Date: 27 Apr 2003 02:09:05 -0400
Posted-Date: 27 Apr 2003 02:09:05 EDT
email@example.com (Bill Last Name Omitted) wrote in message news:03-04-051...
> After some preliminary source diving, I've found the expected Bison
> file which recognizes the nonterminals. However, I was surprised that
> I did not find a flex based scanner (it looks like they use ad hoc
> lexical analysis). Anyone know why they don't use flex for scanning?
The reason for not using lex is compiler performance.
A compiler usually spends most of its compile time in the lexical
analyzer, so the easiest way to improve performance is to rewrite
the tokenizer by hand.
You are lucky that the C (and even gcc) keywords are simple and
well known, so recognizing them is a relatively easy job.
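To make the idea concrete, here is a minimal hand-written tokenizer
sketch in C. It is not GCC's actual scanner; the token names, the tiny
keyword list, and the linear keyword search are all illustrative
assumptions (a production compiler such as GCC uses a gperf-generated
perfect hash for keyword lookup instead):

```c
#include <assert.h>
#include <ctype.h>
#include <string.h>

/* Token kinds for this toy example only (not GCC's token set). */
enum token { TOK_EOF, TOK_IDENT, TOK_KEYWORD, TOK_NUMBER, TOK_OTHER };

/* A handful of C keywords for illustration; a real compiler would
   use a perfect hash rather than a linear scan. */
static const char *keywords[] = { "int", "if", "else", "while", "return", NULL };

static int is_keyword(const char *s, size_t n) {
    for (int i = 0; keywords[i]; i++)
        if (strlen(keywords[i]) == n && memcmp(keywords[i], s, n) == 0)
            return 1;
    return 0;
}

/* Scan one token starting at *p and advance *p past it. */
static enum token next_token(const char **p) {
    while (isspace((unsigned char)**p)) (*p)++;
    if (**p == '\0') return TOK_EOF;
    if (isalpha((unsigned char)**p) || **p == '_') {
        const char *start = *p;
        while (isalnum((unsigned char)**p) || **p == '_') (*p)++;
        return is_keyword(start, (size_t)(*p - start)) ? TOK_KEYWORD : TOK_IDENT;
    }
    if (isdigit((unsigned char)**p)) {
        while (isdigit((unsigned char)**p)) (*p)++;
        return TOK_NUMBER;
    }
    (*p)++;
    return TOK_OTHER;
}
```

The point is that such a loop does no table-driven DFA simulation and
no buffer-management machinery, which is where a generated lex scanner
spends much of its time.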
BTW, you can remove the typedef hack from the lexer and place it
in the parser (where it belongs if you want a decent grammar).
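For readers unfamiliar with the typedef hack (often called the "lexer
hack"): in C, "T * x;" is a declaration of x if T names a type, but a
multiplication expression otherwise, so the scanner traditionally asks
the symbol table whether an identifier is a typedef name. The sketch
below illustrates that feedback loop; the function names and the
fixed-size table are invented for this example, not taken from GCC:

```c
#include <assert.h>
#include <string.h>

/* Two token kinds the parser cares about for this ambiguity. */
enum tok { IDENTIFIER, TYPENAME };

/* Toy symbol table of typedef names (fixed size for the sketch). */
static const char *typedef_names[16];
static int n_typedefs = 0;

/* Called by the parser when it reduces "typedef int T;". */
static void declare_typedef(const char *name) {
    typedef_names[n_typedefs++] = name;
}

/* The lexer hack: the scanner classifies an identifier by
   consulting the symbol table filled in by the parser. */
static enum tok classify(const char *name) {
    for (int i = 0; i < n_typedefs; i++)
        if (strcmp(typedef_names[i], name) == 0)
            return TYPENAME;
    return IDENTIFIER;
}
```

Once "typedef int T;" has been seen, classify("T") yields TYPENAME and
the parser reads "T * x;" as a declaration; moving this disambiguation
into the parser instead is what keeps the grammar clean.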