Re: Backtracking yacc

sasghm@unx.sas.com (Gary Merrill)
Fri, 25 Sep 1992 19:59:23 GMT

          From comp.compilers

Related articles
[10 earlier articles]
Re: Backtracking yacc ipser@solomon.technet.sg (1992-09-19)
Re: Backtracking yacc andrewd@cs.adelaide.edu.au (1992-09-21)
Re: Backtracking yacc sasghm@unx.sas.com (1992-09-21)
Re: Backtracking yacc sasghm@unx.sas.com (1992-09-23)
Re: Backtracking yacc schrod@iti.informatik.th-darmstadt.de (1992-09-23)
Re: Backtracking yacc andrewd@cs.adelaide.edu.au (1992-09-25)
Re: Backtracking yacc sasghm@unx.sas.com (1992-09-25)
Re: Backtracking yacc neitzel@ips.cs.tu-bs.de (1992-09-25)

Newsgroups: comp.compilers
From: sasghm@unx.sas.com (Gary Merrill)
Organization: SAS Institute Inc.
Date: Fri, 25 Sep 1992 19:59:23 GMT
Originator: sasghm@theseus.unx.sas.com
Keywords: parse, LL(1), errors
References: 92-09-059 92-09-174

andrewd@cs.adelaide.edu.au (Andrew Dunstan) writes:


It appears in general that we share the same view of the advantages of
LL parsers.


|> I don't agree with Gary that writing LL grammars is more "natural" than
|> writing LR grammars. But I certainly find making them do semantic actions
|> is more natural, and that seems to me to be more important.


On reflection I decided that this was a stupid thing for me to say.
I do agree, however, about the naturalness of semantic actions.


|> The silly thing about parser wars is that they concentrate on the wrong
|> thing. Parsing correct input is easy. It's also elegant and fun. But the
|> places where we need most work done are the hard parts - things like error
|> recovery and good code generation. Parsers should be chosen according to
|> their ability to help us do these hard things, not the easy bits.


Yes. This is *extremely* important in a production tool used by a
wide variety of programmers. If you are actually selling your
product, you need to consider the weight it will place on your tech
support personnel, how easily diagnostics can be documented, etc.
The granularity of your diagnostics is an important factor. "Syntax
error" just doesn't make it in the real world. And great effort
should be expended to avoid cascading errors (or the cheap
alternative, which is "terminate on first error"). I have put quite
a bit of effort into our yacc-based C++ parser in pursuit of these
goals, but I have never felt in control of things as I do with the
recursive descent parser for our C compiler. There's some invisible
demon who keeps doing things just a little too quickly, taking things
just one step further than I want, ...


Of course one might say that this just argues for *better*
parser-generating tools. Certainly to a degree this is true, but as
you point out, there are still some fundamental problems.


I have to say that I am very suspicious of automatic error recovery
in general. It seems to be either inadequate or too expensive in
terms of efficiency. To most programmers, I think, it is less
important that a compiler diagnose all and only genuine errors than
it is that it diagnose a bunch of genuine errors (and no spurious
ones) *fast*. It seems to me that with a heavy-duty error recovery
scheme it would take less time to do several compile/edit/compile
cycles than to wait for the compiler to figure everything out just right.
I may be wrong, but the error recovery schemes I have seen are pretty
expensive.


--
Gary H. Merrill [Principal Systems Developer, C Compiler Development]
SAS Institute Inc. / SAS Campus Dr. / Cary, NC 27513 / (919) 677-8000
sasghm@theseus.unx.sas.com ... !mcnc!sas!sasghm
