Re: And speaking of fast compilers...

preston@dawn.cs.rice.edu (Preston Briggs)
Tue, 17 Nov 1992 02:14:59 GMT

          From comp.compilers

Related articles
And speaking of fast compilers... pardo@cs.washington.edu (1992-11-12)
Re: And speaking of fast compilers... sasdrf@unx.sas.com (1992-11-16)
Re: And speaking of fast compilers... preston@dawn.cs.rice.edu (1992-11-17)
Re: And speaking of fast compilers... cheryl@gallant.apple.com (1992-11-17)
Re: And speaking of fast compilers... pardo@cs.washington.edu (1992-11-17)
Re: And speaking of fast compilers... pardo@cs.washington.edu (1992-11-23)
Re: And speaking of fast compilers... macrakis@osf.org (1992-11-24)
Re: And speaking of fast compilers... preston@miranda.cs.rice.edu (1992-12-03)

Newsgroups: comp.compilers
From: preston@dawn.cs.rice.edu (Preston Briggs)
Organization: Rice University, Houston
Date: Tue, 17 Nov 1992 02:14:59 GMT
References: 92-11-057 92-11-084
Keywords: performance, Ada, design

sasdrf@unx.sas.com (Dave Frederick) writes:


>One of the hot topics when I last worked on an Ada compiler was doing the
>above range and bounds checking optimization across procedures in a
>package. Interprocedural analysis is quite helpful for determining the
>possibility of these errors on var parameters. Of course, with a 7000-line
>package of page-sized procedures, we could be talking about 200-300
>procedures on which to perform inter-procedural analysis. This could be a
>week's worth of work.


Depends a lot on the exact formulation of the problem. Some approaches to
problems like interprocedural constant propagation, aliasing, etc., have
worked out very well. Others are NP-complete.


Back to the fast compiler idea though...


One idea that hasn't been mentioned was explored by Rick Bubenik in his
thesis -- optimistic computation (with optimistic compilation as a
particular instance). Basically, every time you save a source file, it
fires off a new make job on another processor. If the compile fails due
to errors -- fine, throw it away (but keep the stderr file). If it
succeeds -- great, now wait for the user to type make!
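The save-triggered build above can be sketched in a few lines. This is a
minimal illustration of the idea, not anything from Bubenik's thesis: the
function names, the polling approach, and the build command are all my own
choices for the example.

```python
# Optimistic compilation, sketched: poll a source file's mtime and fire a
# speculative build whenever the file is saved.  All names here are
# hypothetical; a real setup might hook the editor's save event instead.
import os
import subprocess
import time

def speculative_build(cmd):
    """Run the build speculatively; keep stderr whether or not it fails."""
    result = subprocess.run(cmd, capture_output=True, text=True)
    # On failure, throw the build away but keep the diagnostics;
    # on success, the user's eventual "make" finds everything up to date.
    return result.returncode == 0, result.stderr

def watch(path, cmd, poll=1.0):
    """Fire a speculative build each time the file at `path` changes."""
    last = os.path.getmtime(path)
    while True:
        time.sleep(poll)
        now = os.path.getmtime(path)
        if now != last:
            last = now
            ok, errs = speculative_build(cmd)
            if not ok:
                print(errs)  # the kept stderr
```

A real version would dispatch `speculative_build` to another machine and
cancel a stale build when the file is saved again mid-compile.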


This is rather similar to the speculative execution being explored in
processor architecture. Of course, you may need more hardware, but lots of
us work on big networks of machines. Or we could even attempt the compile
in the background -- after all, I don't use much of the machine to edit.
Of course, with a network, it makes more sense to fire up a big parallel
make (imagine we're editing an important include file).
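Launching the big parallel make without tying up the editor's machine might
look like the sketch below. The flags are ordinary GNU make options (-j for
parallel jobs, -k to keep going past errors so the saved stderr is
complete); the function name and log path are my own.

```python
# Background parallel make, sketched: Popen returns immediately, so the
# editor stays responsive while the build runs concurrently.
import subprocess

def background_make(jobs=8, errlog="make.err"):
    """Launch a parallel make without blocking; stderr goes to a file."""
    log = open(errlog, "w")
    # -j<jobs>: run up to <jobs> rules in parallel
    # -k:       keep going past errors so the error log is complete
    return subprocess.Popen(["make", f"-j{jobs}", "-k"], stderr=log)
```

On a network, the same call could front a distributed make that farms the
parallel jobs out to idle machines.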


I would assume that some of these ideas have also been explored in the
programming environment community.


Preston Briggs
--

