Related articles:
Re: State of the Art peter.deussen@fokus.fraunhofer.de (Peter) (2008-07-21)
RE: State of the Art quinn_jackson2004@yahoo.ca (Quinn Tyler Jackson) (2008-07-21)
Re: State of the Art ademakov@gmail.com (Aleksey Demakov) (2008-07-23)
Re: State of the Art anton@mips.complang.tuwien.ac.at (2008-08-03)
From: "Quinn Tyler Jackson" <quinn_jackson2004@yahoo.ca>
Newsgroups: comp.compilers
Date: Mon, 21 Jul 2008 09:36:16 -0700
Organization: Compilers Central
References: 08-07-040
Keywords: practice, parse
Posted-Date: 21 Jul 2008 18:36:09 EDT
> > I'd prefer PEG, which also establishes a defined order for
> > ambiguous cases.
> >
> > DoDi
>
> Hi all,
>
> thanks for the answers so far. To keep track, here's a nice list of
> compiler generator tools:
> http://en.wikipedia.org/wiki/Comparison_of_parser_generators.
>
> PEG is classified as a "recursive descent" parser generator. A quick
> scan of the reference http://pdos.csail.mit.edu/~baford/packrat/popl04/
> has given me the info that this type of parser is capable of
> accepting non-context-free languages.
Peter's original question was:
> - In your opinion, what are the greatest advances in compiler
> construction in the last ten years?
First - the disclaimer: I'm a parser generator author who has developed a
proprietary parsing algorithm, so my opinion may be biased.
Any parser generator notation that allows for the intersection of lexemes
can recognize a subset of Type 1 languages. PEGs also recognize a subset of
Type 1 languages without intersection, via syntactic predicates. (q.v.
http://compilers.iecc.com/comparch/article/05-09-108). IMO, the major
contribution of PEGs is not the context-sensitive possibilities they allow
for, but the re-introduction of memoization into parsing algorithms and the
subsequent avoidance of backtracking and its consequences on complexity. I
say this after some consideration, but my reasoning won't fit** in the
margin here.
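To make the memoization point concrete, here is a minimal packrat sketch in Python (my own illustration, not Ford's implementation). It recognizes the non-context-free language { a^n b^n c^n : n >= 1 } with the well-known PEG S <- &(A 'c') 'a'+ B !. where A <- 'a' A? 'b' and B <- 'b' B? 'c'; the &(...) is the syntactic predicate that gives PEGs their context-sensitive reach, and memoizing each (rule, position) pair is exactly what makes the recognizer "packrat" rather than a backtracker:

```python
from functools import lru_cache

def match_abc(s: str) -> bool:
    """Packrat recognizer for { a^n b^n c^n : n >= 1 }, from the PEG
         S <- &(A 'c') 'a'+ B !.
         A <- 'a' A? 'b'
         B <- 'b' B? 'c'
    Each rule is memoized on its start position, so no (rule, position)
    pair is ever evaluated twice: linear time in len(s)."""

    @lru_cache(maxsize=None)
    def A(i):
        # A <- 'a' A? 'b' ; returns the end position on success, else None
        if i < len(s) and s[i] == 'a':
            j = A(i + 1)
            k = j if j is not None else i + 1   # A? : optional, committed
            if k < len(s) and s[k] == 'b':
                return k + 1
        return None

    @lru_cache(maxsize=None)
    def B(i):
        # B <- 'b' B? 'c'
        if i < len(s) and s[i] == 'b':
            j = B(i + 1)
            k = j if j is not None else i + 1
            if k < len(s) and s[k] == 'c':
                return k + 1
        return None

    # &(A 'c') : look ahead without consuming input
    j = A(0)
    if j is None or j >= len(s) or s[j] != 'c':
        return False
    i = 0
    while i < len(s) and s[i] == 'a':           # 'a'+
        i += 1
    if i == 0:
        return False
    return B(i) == len(s)                       # !. : end of input
```

The predicate checks that the a's and b's balance, then the body checks that the b's and c's balance; neither check alone is beyond a CFG, but their combination is.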
For other interesting work, you might look into Okhotin's
Boolean/Conjunctive grammars. They also allow for context sensitivity,
and IMO, in a more orderly fashion than PEGs. Okhotin is a meticulous
mathematician before all else, and the practical results he has
obtained are quite thoroughly examined in terms of the theory behind
them.
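For a flavor of how conjunction works, here is a toy Python sketch of the textbook conjunctive grammar for { a^n b^n c^n }: S -> AB & DC, with A -> aA|e, B -> bBc|e, D -> aDb|e, C -> cC|e. A string is in the language only if *both* conjuncts derive it, i.e. the language is the intersection a*b^n c^n with a^n b^n c*. The conjunct recognizers below are hand-coded shortcuts for these two simple sublanguages, purely for illustration, not a general conjunctive-grammar parser:

```python
def conjunct_ab(s: str) -> bool:
    # A B  where A -> aA | e  and  B -> bBc | e : the language a* b^n c^n
    i = 0
    while i < len(s) and s[i] == 'a':
        i += 1
    rest = s[i:]
    n = len(rest) // 2
    return rest == 'b' * n + 'c' * n

def conjunct_dc(s: str) -> bool:
    # D C  where D -> aDb | e  and  C -> cC | e : the language a^n b^n c*
    j = len(s)
    while j > 0 and s[j - 1] == 'c':
        j -= 1
    head = s[:j]
    n = len(head) // 2
    return head == 'a' * n + 'b' * n

def accepts(s: str) -> bool:
    # S -> AB & DC : conjunction means BOTH conjuncts must derive s
    return conjunct_ab(s) and conjunct_dc(s)
```

The first conjunct forces the b's and c's to balance, the second forces the a's and b's to balance; their intersection is exactly a^n b^n c^n.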
My personal favorite ... adaptive grammars and parsers. This
particular area has come a long way in the last 10 years.
Relevant links in that area:
http://en.wikipedia.org/wiki/Adaptive_grammar
http://www.pcs.usp.br/~lta/union/index.php?cp=4
Using adaptive techniques, I've yet to come up against a
context-sensitive language that isn't in practice recognizable in less
than O(n^3) time, with some very interesting results (such as
pseudoknots) being linear.
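As a taste of what "adaptive" means, here is a toy recognizer (entirely my own illustration, unrelated to Meta-S or any published formalism) in which one rule of the grammar is mutated while the parse is in progress: parsing a declaration adds a new production for the identifier rule, so a context-sensitive constraint like declare-before-use falls out of the grammar itself rather than out of a symbol table bolted on afterward:

```python
def parse_program(tokens: list) -> bool:
    """Toy adaptive recognizer for sequences of 'decl <id>' / 'use <id>'.
    The production set for <declared-id> starts empty and grows as the
    parse proceeds, so 'use' before 'decl' simply has no matching rule."""
    rules = {"declared-id": set()}   # the mutable part of the grammar
    i = 0
    while i < len(tokens):
        if tokens[i] == "decl" and i + 1 < len(tokens):
            # Adapt: extend the grammar with a production for this identifier
            rules["declared-id"].add(tokens[i + 1])
            i += 2
        elif tokens[i] == "use" and i + 1 < len(tokens):
            if tokens[i + 1] not in rules["declared-id"]:
                return False         # no production for this identifier (yet)
            i += 2
        else:
            return False
    return True
```

Real adaptive formalisms are far more general than this (rules can be added, removed, or rewritten wholesale), but the mechanism, a grammar that edits itself mid-parse, is the same.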
In terms of compiler construction practice, in my opinion, the most
important trend has been the move toward virtual machine
backends. This started with Java and has been going strong with the
CLR of the .NET family of languages. As it stands, my cell phone is
powered by Java ... in 5 years, my toaster may be powered by C#. I see
this trend as important because certain practices in compiler
construction will have more impact than previously. For instance, new
optimization techniques will benefit a wider range of applications.
Others' mileage will likely vary.
Cheers,
--
Quinn Tyler Jackson
** But two for instances: I believe Ford's PEGs led to the introduction of
memoization into ANTLR, and since ANTLR is used in a non-laboratory sense,
the impact there is on actual practice. It also led to the introduction of
memoization into the Meta-S parsing engine.