Re: grammar/dialect question

"W. Lyle Hayhurst" <wlhst6+@pitt.edu>
23 Nov 1999 13:02:17 -0500

          From comp.compilers

Related articles
grammar/dialect question wlhst6+@pitt.edu (W. Lyle Hayhurst) (1999-11-23)
Re: grammar/dialect question wlhst6+@pitt.edu (W. Lyle Hayhurst) (1999-11-23)
From: "W. Lyle Hayhurst" <wlhst6+@pitt.edu>
Newsgroups: comp.compilers
Date: 23 Nov 1999 13:02:17 -0500
Organization: University of Pittsburgh
References: 99-11-136
Keywords: parse

The differences between the dialects should be small enough to fudge
with semantic checks.
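
To sketch what I mean (all of the names here are hypothetical), the
parser could accept the union of the dialects, and a post-parse pass
could reject whatever construct the selected dialect does not allow:

/* Minimal sketch, hypothetical dialects and constructs: parse the
   superset grammar, then check each construct against the dialect. */
#include <stdio.h>

enum dialect   { DIALECT_A, DIALECT_B, NUM_DIALECTS };
enum construct { LET_BINDING, PLAIN_ASSIGN, NUM_CONSTRUCTS };

/* 1 = construct is legal in that dialect, 0 = semantic error */
static const int allowed[NUM_DIALECTS][NUM_CONSTRUCTS] = {
    /* DIALECT_A */ { 0, 1 },
    /* DIALECT_B */ { 1, 1 },
};

int check(enum dialect d, enum construct c)
{
    return allowed[d][c];
}

int main(void)
{
    /* e.g. a "let" binding parsed while compiling for dialect A */
    if (!check(DIALECT_A, LET_BINDING))
        fprintf(stderr, "construct not allowed in this dialect\n");
    return 0;
}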


However, this raises a second question for me. I'm likely to be
updating this grammar 'on-the-fly' every time I discover a new
language variant (which can happen at any time in the lifetime of this
grammar).


So, what is the most flexible way to do the lexing and parsing, such
that changes to the grammar don't have a big impact on the lexing,
parsing, and semantic-analysis code?


I ask because I once wrote an LL(1) parser: I had generated all of my
look-ahead tables when, argh, I discovered I had to change the grammar
a little. Going back and updating the look-ahead tables by hand ended
up being pretty painful.
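
To make that concrete, here is a tiny C sketch of the kind of
hand-built table I mean (the grammar, token names, and production
numbering are made up for illustration). Even a small change to the
grammar means recomputing FIRST/FOLLOW sets and touching entries
scattered all over the table:

/* Predictive table for the usual expression grammar, with the
   productions numbered by hand:
     0: E  -> T E'      1: E' -> + T E'    2: E' -> (empty)
     3: T  -> F T'      4: T' -> * F T'    5: T' -> (empty)
     6: F  -> ( E )     7: F  -> id       -1: syntax error      */
#include <stdio.h>

enum { T_ID, T_PLUS, T_STAR, T_LPAREN, T_RPAREN, T_EOF, NUM_TERMS };
enum { N_E, N_EPRIME, N_T, N_TPRIME, N_F, NUM_NONTERMS };

static const int table[NUM_NONTERMS][NUM_TERMS] = {
    /*           id   +   *   (   )   $   */
    /* E  */  {   0, -1, -1,  0, -1, -1 },
    /* E' */  {  -1,  1, -1, -1,  2,  2 },
    /* T  */  {   3, -1, -1,  3, -1, -1 },
    /* T' */  {  -1,  5,  4, -1,  5,  5 },
    /* F  */  {   7, -1, -1,  6, -1, -1 },
};

int main(void)
{
    /* which production to expand when E is on the stack and the
       look-ahead token is '(' */
    printf("M[E, '('] = production %d\n", table[N_E][T_LPAREN]);
    return 0;
}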


I guess I should add that I plan on rolling my own lexer and parser
for this job.


Thanks
Lyle


On 23 Nov 1999, W. Lyle Hayhurst wrote:


> Hence, my question: can dialects be integrated into the standard
> grammar of a language, or do you have to write an entirely separate
> grammar for each dialect?
[Not to belabor the obvious, but if you want to save yourself needless
work, don't roll your own, use a tool like yacc that generates the
tables for you, so you can just change the grammar and rebuild when
you need to. -John]
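
For example, with a yacc grammar along these lines (the token names
and the second alternative are made up just to sketch the idea, and a
yylex() from lex or a hand-written scanner still has to supply the
tokens), absorbing a new dialect is a matter of adding an alternative
production and rerunning yacc, which regenerates all the parse tables:

%{
/* Hypothetical toy grammar: the LET alternative stands in for a
   dialect-specific form. */
#include <stdio.h>
int yylex(void);
void yyerror(const char *s) { fprintf(stderr, "%s\n", s); }
%}

%token IDENT NUMBER ASSIGN LET

%%

stmt
    : IDENT ASSIGN expr        /* base dialect:    x = 42     */
    | LET IDENT ASSIGN expr    /* second dialect:  let x = 42 */
    ;

expr
    : NUMBER
    | IDENT
    ;

%%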


