Information about scripting languages — email@example.com (Gianluca Silvestri) (2008-09-10)
Re: Information about scripting languages — firstname.lastname@example.org (lican) (2008-09-16)
Re: Information about scripting languages — email@example.com (Stephen Horne) (2008-09-17)
Re: Information about scripting languages — firstname.lastname@example.org (jpoirier) (2008-09-18)
Re: Information about scripting languages — email@example.com (Gianluca Silvestri) (2008-09-19)
From: Stephen Horne <firstname.lastname@example.org>
Date: Wed, 17 Sep 2008 18:54:32 +0100
Keywords: interpreter, code, books
Posted-Date: 17 Sep 2008 17:56:41 EDT
On Tue, 16 Sep 2008 19:05:59 -0700 (PDT), lican <email@example.com> wrote:
>The problem is that most of the books and works stop after
>parsing and AST and do not explain what to do exactly between AST and
Modern Compiler Design
Grune, Bal, Jacobs, Langendoen
Most recent edition was 2000; a new edition is due in 2010.
Chapter 1 : Introduction
Chapter 2 : Lexical analysis
Chapter 3 : Parsing
Chapter 4 : AST handling and attribute grammars
Chapter 5 : Processing intermediate code
Chapter 6 : Memory management
Chapter 7 : Imperative and Object-Oriented Programs
Chapter 8 : Functional Programs
Chapter 9 : Logic Programs
Chapter 10 : Parallel and Distributed Programs
Appendix A : Simple OO compiler/interpreter
See http://www.cs.vu.nl/~dick/MCD.html for more details. Also, Amazon
supports search-inside for this book.
On my first read, years ago, I struggled with LR parsing until I
downloaded "Parsing Techniques - A Practical Guide". The postscript
and pdf files are still available from
http://www.cs.vu.nl/~dick/PTAPG.html, but there's also a second
edition now available to buy. Having got past that minor hurdle, I've
always found Modern Compiler Design to be very readable. I always
meant to buy the Dragon book as well, but basically MCD and some web
research have normally been enough.
Some things I wish I'd seen explained in textbooks - can be relevant
to compilers but also more general...
Zobrist hashes : relevant to generating digraphs using closure
algorithms, e.g. building LR or regular-expression state tables,
decision trees etc. The hash doesn't have to be a subscript into a
huge hash table - it can be a key prefix for tree-based data
structures, for example, to optimise average key-comparison time and
therefore search speed. The point is that it's a hash that doesn't
need a full calculation from the data it represents, but is computed
incrementally from previous states, and it helps quickly identify
cases where several routes lead to the same state.
First encountered it in Game Programming Gems 4, of all places. The
example was a chess program. Haven't actually used it yet, but I
intend to try the idea to optimise various bits of code.
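The incremental-update idea above can be sketched in a few lines of
Python. This is a minimal illustration, not from any of the books
mentioned; the item names and table sizes are made up for the example.
A state is modelled as a set of items (think LR items), each item gets
a fixed random bitstring, and the hash of a state is the XOR of its
items' bitstrings - so adding or removing one item is a single XOR:

```python
import random

random.seed(42)  # fixed seed so the example is deterministic

# Hypothetical universe of items a state can contain (e.g. LR items).
ITEMS = ["A", "B", "C", "D"]
ZOBRIST = {item: random.getrandbits(64) for item in ITEMS}

def zobrist(state):
    """Full hash of a state: XOR of the codes of its items."""
    h = 0
    for item in state:
        h ^= ZOBRIST[item]
    return h

def toggle(h, item):
    """Incremental update: add/remove one item in O(1), no rehash."""
    return h ^ ZOBRIST[item]

h1 = zobrist({"A", "B"})
h2 = toggle(h1, "C")  # hash of {"A", "B", "C"}, one XOR
assert h2 == zobrist({"A", "B", "C"})

# XOR is order-independent, so two different routes to the same state
# produce the same hash - which is how duplicates are spotted cheaply.
assert zobrist({"B", "A"}) == zobrist({"A", "B"})
```

Because XOR is its own inverse, the same `toggle` both adds and removes
an item, which is what makes the scheme attractive inside a closure
loop.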
Hopcroft minimisation : should be self-explanatory. Why use LALR(1),
for example, when you can start with LR(1) and potentially optimise
away more redundancy than LALR(1) manages?
Had to put this one together from tidbits all over the internet and
some guesswork. Found out about it from Adrian Thurston and his
program Ragel. I use it a lot, optimising decision trees for size and
(as with Ragel) to optimise auto-generated finite state models.
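Since I had to piece the algorithm together from fragments, here is a
compact Python sketch of Hopcroft-style partition refinement for a
complete DFA, under my own reading of the algorithm (state names and
the example machine are invented for illustration). It repeatedly
splits blocks of states that disagree on which block a symbol leads
into:

```python
from collections import defaultdict

def hopcroft(states, alphabet, delta, accepting):
    """Minimise a complete DFA by partition refinement.

    delta maps (state, symbol) -> state.  Returns the partition of
    states into equivalence classes (a set of frozensets).
    """
    # Inverse transitions: inv[(t, c)] = states reaching t on c.
    inv = defaultdict(set)
    for (s, c), t in delta.items():
        inv[(t, c)].add(s)

    accepting = frozenset(accepting)
    rest = frozenset(states) - accepting
    partition = {accepting, rest} - {frozenset()}
    work = set(partition)  # blocks still to be used as splitters

    while work:
        a = work.pop()
        for c in alphabet:
            # x = states that move into block `a` on symbol c
            x = set()
            for s in a:
                x |= inv[(s, c)]
            for y in list(partition):
                inter, diff = y & x, y - x
                if inter and diff:  # y disagrees on c: split it
                    inter, diff = frozenset(inter), frozenset(diff)
                    partition.remove(y)
                    partition |= {inter, diff}
                    if y in work:
                        work.remove(y)
                        work |= {inter, diff}
                    else:
                        # Only the smaller half needs reprocessing.
                        work.add(min(inter, diff, key=len))
    return partition

# Example: a 3-state DFA for "strings over {0,1} ending in 1",
# where q0 and q2 are behaviourally identical and should merge.
delta = {("q0", "0"): "q0", ("q0", "1"): "q1",
         ("q1", "0"): "q2", ("q1", "1"): "q1",
         ("q2", "0"): "q2", ("q2", "1"): "q1"}
p = hopcroft({"q0", "q1", "q2"}, "01", delta, {"q1"})
assert p == {frozenset({"q1"}), frozenset({"q0", "q2"})}
```

The "keep the smaller half" step is what gives Hopcroft's algorithm
its O(n log n) behaviour; a naive refinement that re-queues both
halves is still correct, just slower.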