GLA email@example.com (1989-02-16)
Date: Thu, 16 Feb 89 18:06:05 MST
From: firstname.lastname@example.org (Robert Gray)
Recent discussions of lexical analysis have focused on the problem
of scanning speed. Both Flex and GLA generate very fast scanners.
GLA-generated scanners are slightly faster, but Flex has the
advantage of being compatible with lex. In this note I'd like to
emphasize what I consider GLA's more significant contribution -
that of encapsulating knowledge of lexical analysis.
GLA provides a simple, yet powerful, interface to a regular expression
translator. As an analogy, consider the troff macro packages -me,
-ms and -man. They simplify writing papers by encapsulating knowledge
of how to begin paragraphs, build numbered lists, and so on. Most
troff users avoid the horrendous task of dealing with troff directly
by using one of these macro packages. In addition to simplifying
the user's task, they provide standard formats that increase the
uniformity of the documents produced using them. (This is particularly
true of -man.)
Likewise, GLA and its libraries have captured knowledge about
lexical analysis. For example, the GLA macro C_STRING_LITERAL
arranges for proper scanning and processing of legal C strings
and flags illegal ones. This includes the tedious work of handling
arbitrary length, multi-line strings, escaped embedded double quotes,
null strings, and converting strings with escape sequences such
as \\, \r and \013.
Coupling GLA with libraries for string storage, identifier
hash tables and error reporting (all of which are available
from the compiler group at the University of Colorado)
allows compiler writers to spend their time on problems that
have not yet been solved.
Bob Gray (email@example.com)
Computer Science Dept 430
University of Colorado
Boulder Co 80309-0430