Newsgroups: comp.compilers
From: Preston Briggs <preston@rice.edu>
In-Reply-To: <1990Aug07.153407.8877@esegue.segue.boston.ma.us>
Keywords: code, optimize, design
Organization: Rice University, Houston
Date: Wed, 08 Aug 90 17:16:40 GMT
In article <1990Aug07.153407.8877@esegue.segue.boston.ma.us> you write:
>I would like to know what people think is the best Intermediate Representation
>(IR) to be used for highly effective optimizations and code generation; it
>should also be portable.
>Examples of IRs that I know:
>(1) Abstract-syntax-tree (looks like Scheme code)
>(2) DAG
>(3) Three address code
>(4) P-code
>(5) Stanford's U-code
First, I'd classify (5) as an extension of (4) and I might group (2) with
(3).
I like low-level 3-address code (even lower level than assembly). Several
optimizing compilers have been built using such a representation -- the best
known is the PL.8 compiler built by researchers at IBM.
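As a rough sketch of what "lower level than assembly" can mean (the opcode
names and the lowering below are invented for illustration, not taken from
PL.8 or any real compiler): the IL is just a sequence of quadruples over an
unbounded supply of virtual registers, with all address arithmetic spelled
out.

    /* Hypothetical 3-address (quadruple) IL -- a minimal sketch. */
    #include <stdio.h>

    typedef enum { OP_LOADI, OP_MUL, OP_ADD, OP_STORE } Opcode;

    typedef struct {
        Opcode op;
        int dst;          /* virtual register receiving the result    */
        int src1, src2;   /* virtual-register (or immediate) operands */
    } Quad;

    int main(void)
    {
        /* a[i] = b + c, with the address arithmetic made explicit;
           assume i, &a, b, c already live in r10, r11, r12, r13.   */
        Quad code[] = {
            { OP_LOADI, 1, 4,  0  },   /* r1 = 4 (element size) */
            { OP_MUL,   2, 10, 1  },   /* r2 = r10 * r1         */
            { OP_ADD,   3, 11, 2  },   /* r3 = r11 + r2         */
            { OP_ADD,   4, 12, 13 },   /* r4 = r12 + r13        */
            { OP_STORE, 0, 3,  4  },   /* mem[r3] = r4          */
        };
        printf("%d quads\n", (int)(sizeof code / sizeof code[0]));
        return 0;
    }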
I don't much like ILs based on stack machines (roughly (4) and (5)), at least
for optimization purposes. Optimization wants explicit registers to hold
intermediate results.
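To make that concrete, here is a small sketch (the stack sequence is invented
pseudo-P-code, not any real encoding): with a stack IL a repeated subexpression
is either recomputed or juggled with DUPs, while a register-style IL names the
intermediate result once and reuses the name -- which is exactly what an
optimizer wants to manipulate.

    /* x = (a + b) * (a + b)
     *
     * Stack IL (invented):        3-address IL:
     *   PUSH a                      t1 = a + b
     *   PUSH b                      x  = t1 * t1
     *   ADD
     *   DUP
     *   MUL
     *   POP  x
     */
    #include <stdio.h>

    int main(void)
    {
        int a = 3, b = 4;
        int t1 = a + b;       /* explicit, named intermediate result */
        int x  = t1 * t1;     /* reused by name, never recomputed    */
        printf("%d\n", x);
        return 0;
    }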
ASTs seem too high-level for doing lots of optimization. We want the details
of array accesses, etc., exposed to the optimizer.
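For instance (a hedged sketch -- the row-major layout and names below are
assumptions made up for the example, not anyone's front end): the AST node for
a[i][j] hides a multiply and an add that only become visible, and therefore
optimizable, once the access is lowered.

    #include <stdio.h>
    #define ROWS 4
    #define COLS 5

    int main(void)
    {
        double a[ROWS][COLS] = {{0}};
        int i = 2, j = 3;

        /* High-level form: the address arithmetic is implicit. */
        a[i][j] = 1.0;

        /* Lowered form: the multiply and add are explicit, so an
           optimizer can see them (e.g., to strength-reduce the
           multiply inside a loop or common it across accesses). */
        double *base = &a[0][0];
        int t1 = i * COLS;    /* row offset    */
        int t2 = t1 + j;      /* element index */
        base[t2] = 2.0;

        printf("%f\n", a[2][3]);
        return 0;
    }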
Actually, I'll qualify this a little. I believe there are many optimizations
that can be carried out most effectively on a high-level representation (for
example, those requiring dependence analysis) and many that should be carried
out on a low-level representation (e.g., common-subexpression elimination,
strength reduction). This is the sort of approach we've taken locally.
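As one low-level example (a sketch, not any particular compiler's output):
strength reduction rewrites the per-iteration "index times element size"
multiply of an array walk into a pointer that is bumped each time around the
loop -- a rewrite that is only expressible once the address arithmetic has
been exposed.

    #include <stdio.h>

    int main(void)
    {
        int a[8] = {1, 2, 3, 4, 5, 6, 7, 8}, n = 8;

        /* Before: each iteration effectively computes base + i*4. */
        long s1 = 0;
        for (int i = 0; i < n; i++)
            s1 += a[i];

        /* After strength reduction: the multiply is gone; a pointer
           is incremented by the element size on every iteration.   */
        long s2 = 0;
        for (int *p = a; p < a + n; p++)
            s2 += *p;

        printf("%ld %ld\n", s1, s2);
        return 0;
    }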
Naturally, there are counter-examples to everything I've said. AST-based
compilers include Bliss-11, the PQCC-based systems, and some of the Ada
efforts. There's also Guy Steele's Rabbit compiler. Fred Chow's thesis
describes an optimizing Pascal compiler based on (5). I believe Chow's work
is also the basis for the MIPS compilers. Some version of (4) is used in
Powell's Modula-2 compiler.
Preston
[Anybody tried program dependence graphs, per Ferrante et al. at IBM? From
the articles I've seen, they look pretty neat. -John]