From: George Neuner <gneuner2@comcast.net>
Newsgroups: comp.compilers
Date: Mon, 27 Apr 2009 13:53:46 -0400
Organization: A noiseless patient Spider
References: 09-04-015 09-04-043 09-04-050 09-04-065
Keywords: books
Posted-Date: 28 Apr 2009 05:11:48 EDT
On Sun, 26 Apr 2009 11:16:51 +1200, Ryan McCoskrie
<ryan.mccoskrie@gmail.com> wrote:
>> Ryan McCoskrie <ryan.mccoskrie@gmail.com> writes:
>>>
>>> I'm trying to find some good texts on compiler construction that don't
>>> explain things in terms of visual information (trees etcetera) and
>>> maths (which boils down to processing visual information).
>>
>I can follow algorithms in pretty much anything Fortran-esque
>so long as I know _why_ things are the way they are, and I'm beginning
>to get my head around Lisp (though I had to find my own syntax-oriented
>way to learn it).
>
>So yes, linear syntax helps a great deal, but the real issue is the
>approach to all of this.
The "how" can usually be described with text or, at worst, some kind
of pseudo code. Unfortunately, it is the "why" that frequently
requires a graphic to explain.
Lexing and parsing technology is based on various types of automata.
Automata are nearly always explained with pictures, either as literal
images or via denotational math. However, you don't really need to
understand automata unless you want to create tools like yacc.
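(If you do want a taste of what an automaton looks like without a
picture, here is a minimal sketch in C - a made-up three-state DFA
written purely as a transition table, accepting identifiers of the
form letter-then-letters-or-digits. The table *is* the automaton;
the diagrams in the books are just drawings of tables like this.)

    /* Hypothetical 3-state DFA as a transition table: accepts a
       letter followed by any mix of letters and digits. */
    #include <ctype.h>
    #include <stdio.h>

    enum { START, IDENT, DEAD, NSTATES };
    enum { LETTER, DIGIT, OTHER, NCLASSES };

    static const int next[NSTATES][NCLASSES] = {
        /*            LETTER  DIGIT  OTHER */
        /* START */ { IDENT,  DEAD,  DEAD },
        /* IDENT */ { IDENT,  IDENT, DEAD },
        /* DEAD  */ { DEAD,   DEAD,  DEAD },
    };

    static int classify(int c)
    {
        if (isalpha(c)) return LETTER;
        if (isdigit(c)) return DIGIT;
        return OTHER;
    }

    static int accepts(const char *s)
    {
        int state = START;
        for (; *s; s++)
            state = next[state][classify((unsigned char)*s)];
        return state == IDENT;        /* IDENT is the only accepting state */
    }

    int main(void)
    {
        printf("%d %d\n", accepts("x1"), accepts("1x"));  /* prints "1 0" */
        return 0;
    }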
Creating your own parsers does _not_ require an understanding of
automata, but rather only familiarity with (E)BNF notation and a basic
knowledge of language theory - what it means for a syntax to be left
or right recursive, and what lookahead, precedence and associativity
mean - all of which you should be able to get from the explanatory text.
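To make that concrete, here is a minimal recursive descent sketch in
C for a made-up toy grammar (integer literals, '+' and '*'). There is
one function per grammar rule, the loops give left associativity, and
precedence falls out of which rule calls which - no automata anywhere:

    /* Toy grammar (hypothetical, just for illustration):
           expr   ::= term { '+' term }       -- '+' binds loosest
           term   ::= factor { '*' factor }   -- '*' binds tighter
           factor ::= integer
    */
    #include <stdio.h>
    #include <stdlib.h>

    static const char *p;                 /* cursor into the input */

    static long factor(void)
    {
        char *end;
        long v = strtol(p, &end, 10);     /* read an integer literal */
        p = end;
        return v;
    }

    static long term(void)
    {
        long v = factor();
        while (*p == '*') { p++; v *= factor(); }   /* higher precedence */
        return v;
    }

    static long expr(void)
    {
        long v = term();
        while (*p == '+') { p++; v += term(); }     /* lower precedence */
        return v;
    }

    int main(void)
    {
        p = "2+3*4";
        printf("%ld\n", expr());          /* prints 14 */
        return 0;
    }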
The bulk of compiler technology is based on graph, tree and set
algorithms - connectedness, tracing/searching, flow, fixed points,
domination, covering, coloring, etc. You will need compiler texts to
understand the specifics of applying them to compiling, but the
algorithms themselves (and the theory behind them) are generic
knowledge that you can acquire any way that you find suitable.
[Not that I can help you much with that - every algorithm book on my
shelf is picture centric by your definition.]
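For a flavor of what those algorithms look like in plain code, here
is a minimal worklist sketch in C - it just computes which nodes of a
small made-up graph are reachable from node 0, iterating until nothing
changes. Dataflow analyses in a compiler (liveness, reaching
definitions, and so on) reach their fixed points the same way, only
with sets of facts per node instead of a single flag:

    #include <stdio.h>

    #define N 5

    /* adjacency matrix of a hypothetical 5-node graph:
       0 -> 1, 0 -> 2, 1 -> 3, 4 -> 0 (so node 4 is unreachable) */
    static const int edge[N][N] = {
        {0,1,1,0,0},
        {0,0,0,1,0},
        {0,0,0,0,0},
        {0,0,0,0,0},
        {1,0,0,0,0},
    };

    int main(void)
    {
        int reached[N] = {0};
        int worklist[N], top = 0;

        reached[0] = 1;                  /* the initial fact */
        worklist[top++] = 0;

        while (top > 0) {                /* run until the fixed point */
            int n = worklist[--top];
            for (int m = 0; m < N; m++) {
                if (edge[n][m] && !reached[m]) {
                    reached[m] = 1;      /* a new fact was discovered */
                    worklist[top++] = m; /* so revisit its successors */
                }
            }
        }

        for (int n = 0; n < N; n++)
            printf("node %d: %s\n", n,
                   reached[n] ? "reachable" : "unreachable");
        return 0;
    }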
Since you are just starting out, two classic books you might want to
look at are
Abelson & Sussman, "Structure and Interpretation of Computer
Programs"
Friedman, Wand & Haynes, "Essentials of Programming Languages"
usually referred to as "SICP" and "EOPL". Although these books are
sometimes used to teach general programming concepts, their subject
matter is really language design and interpretation. They are both
long on code (both real and pseudo) and explanatory text, and
relatively short on pictures. Because interpretation is a close
relative of compilation - compilation can be viewed as automated
construction of an interpretation - you might pick up some valuable
information from reading these.
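As a rough illustration of that last point (not from either book, just
a sketch reusing the toy grammar above): take the little recursive
descent evaluator, replace each computation with the emission of an
instruction for a hypothetical stack machine, and it becomes a
one-pass compiler:

    #include <stdio.h>
    #include <stdlib.h>

    static const char *p;

    static void factor(void)
    {
        char *end;
        long v = strtol(p, &end, 10);
        p = end;
        printf("PUSH %ld\n", v);      /* the interpreter returned v here */
    }

    static void term(void)
    {
        factor();
        while (*p == '*') { p++; factor(); printf("MUL\n"); }
    }

    static void expr(void)
    {
        term();
        while (*p == '+') { p++; term(); printf("ADD\n"); }
    }

    int main(void)
    {
        p = "2+3*4";
        expr();   /* emits: PUSH 2, PUSH 3, PUSH 4, MUL, ADD */
        return 0;
    }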
George