Related articles
Flex vs Cocktail "Rex" ariadne@access.mbnet.mb.ca (1996-11-14)
Re: Flex vs Cocktail "Rex" clark@quarry.zk3.dec.com (1996-11-21)
From: clark@quarry.zk3.dec.com (Chris Clark USG)
Newsgroups: comp.compilers
Date: 21 Nov 1996 23:13:10 -0500
Organization: Digital Equipment Corporation - Marlboro, MA
References: 96-11-093
Keywords: lex, comment
> My scanner is for PL/I, with both very complex tokens and over 400
> case-insensitive keywords. The Rex "tunnel automaton" does a very good
> job on this scanner, and Herr Grosch's papers and examples would lead
> me to expect that "flex" may not generate an efficient automaton for
> this type of scanner.
If you do your keyword processing as part of symbol table lookup,
[John, please put a pointer into the discussion on that topic.] then
the issue is moot. You can use nearly any scanner generator and get
reasonable efficiency, provided that the scanner generator has some
mechanism for letting the user override the token type based on the
lookup result. I know that this is doable in flex.
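For illustration, here is a minimal C sketch of the idea: the scanner
matches one generic identifier pattern, and a case-insensitive table
lookup afterwards decides whether the lexeme is really a keyword. The
token codes and the keyword list below are invented, not taken from any
actual PL/I scanner.

/* Sketch only: the identifier action calls classify() to override the
 * token type when the lexeme turns out to be a keyword. */
#include <stdio.h>
#include <ctype.h>

enum { TOK_IDENTIFIER = 1, TOK_KEYWORD = 2 };

/* Stand-in keyword table; a real PL/I scanner would have ~400 entries. */
static const char *keywords[] = { "DECLARE", "PROCEDURE", "IF", "THEN", "END", 0 };

/* Case-insensitive comparison, since PL/I keywords are case-insensitive. */
static int same_word(const char *a, const char *b)
{
    while (*a && *b && tolower((unsigned char)*a) == tolower((unsigned char)*b))
        a++, b++;
    return *a == '\0' && *b == '\0';
}

static int classify(const char *lexeme)
{
    for (int i = 0; keywords[i]; i++)
        if (same_word(lexeme, keywords[i]))
            return TOK_KEYWORD;
    return TOK_IDENTIFIER;
}

int main(void)
{
    printf("%d\n", classify("Procedure")); /* prints 2: keyword */
    printf("%d\n", classify("counter"));   /* prints 1: identifier */
    return 0;
}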
However, if you need to have your keywords processed as part of the
scanning and cannot use symbol table lookup, then the "tunnel
automata" approach produces very space-efficient tables with only a
limited run-time penalty, adding roughly the overhead of one if-check
per table lookup.
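As a rough sketch of the general idea (not Rex's actual tables; every
state and transition below is invented), a state can store only the
transitions that differ from the state it "tunnels" to, and the lookup
falls through to that state otherwise, costing one extra if-check per
access:

#include <stdio.h>

#define NSTATES 3
#define NCHARS  4   /* tiny alphabet: character classes 0..3 */
#define NONE   -1

/* Explicit transitions; NONE means "defer to the tunnel state". */
static int trans[NSTATES][NCHARS] = {
    { 1, NONE, NONE, 2 },      /* state 0 */
    { NONE, 2, NONE, NONE },   /* state 1: mostly like state 0 */
    { NONE, NONE, NONE, NONE } /* state 2: entirely like state 0 */
};

/* Which state each state tunnels to (NONE = no tunnel). */
static int tunnel[NSTATES] = { NONE, 0, 0 };

static int next_state(int state, int ch)
{
    while (state != NONE) {
        int t = trans[state][ch];
        if (t != NONE)           /* the one extra check per lookup */
            return t;
        state = tunnel[state];   /* fall through to the shared state */
    }
    return NONE;                 /* no transition anywhere: reject */
}

int main(void)
{
    printf("%d\n", next_state(1, 3)); /* 2: found via tunnel to state 0 */
    printf("%d\n", next_state(1, 1)); /* 2: explicit transition */
    printf("%d\n", next_state(2, 2)); /* -1: no transition anywhere */
    return 0;
}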
-Chris
[The question of putting keywords in the lexer vs. treating them as symbols
and looking them up has indeed come up. The archives are all at
http://iecc.com/compilers, with full text search. -John]