Re: Randomized compilation order (was: failure due to compiler?)


          From comp.compilers

Related articles
Re: failure due to compiler? kanze@lts.sel.alcatel.de (1996-07-04)
failure due to compiler? flake@elda.demon.co.uk (1996-07-09)
Re: failure due to compiler? cliffc@ami.sps.mot.com (1996-07-10)
Randomized compilation order (was: failure due to compiler?) masticol@scr.siemens.com (1996-07-13)
Re: Randomized compilation order (was: failure due to compiler?) darius@phidani.be (Darius Blasband) (1996-07-24)
Re: Randomized compilation order (was: failure due to compiler?) mgrice@iastate.edu (Matthew B Grice) (1996-07-26)
Re: Randomized compilation order (was: failure due to compiler?) pardo@cs.washington.edu (1996-07-31)
From: Matthew B Grice <mgrice@iastate.edu>
Newsgroups: comp.compilers
Date: 26 Jul 1996 23:19:52 -0400
Organization: Iowa State University, Ames, Iowa
References: 96-07-041 96-07-056 96-07-070 96-07-091 96-07-172
Keywords: design

Darius Blasband wrote:
>
> > >> [Anyone got any insight into why in the world they made a non-deterministic
> > >> compiler? -John]
>
> Maybe as a way of exercising the degrees of freedom in the language's
> definition, so that a working program does not depend on a specific
> implementation for its portability. (Frankly, I don't think that is why
> they did it in the first place, but I guess it might be a possibility,
> for instance, regarding Ada tasking. A program that works on one machine
> might well not work under another compiler if the scheduling strategy
> differs. One way of attacking this problem would be for the compiler to
> exercise various strategies, at random or pseudo-randomly, for testing
> purposes...)


I didn't see the first part of this thread, but a common reason for doing
this sort of thing is that the common-case and worst-case performance
of an algorithm are very different, and the worst case is rare.
Quicksort comes to mind. So you put random factors into the algorithm,
ensuring that no single input can trigger worst-case performance every
time (or even a significant fraction of the times) it is run.
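
A minimal sketch of the quicksort case (my code, not from the original
post): choosing the pivot at random means no fixed input can trigger the
O(n^2) worst case on every run.

#include <stdlib.h>

static void swap(int *a, int *b) { int t = *a; *a = *b; *b = t; }

/* Sort v[lo..hi] inclusive, with a randomly chosen pivot. */
static void rquicksort(int v[], int lo, int hi)
{
    if (lo >= hi)
        return;

    /* Random pivot: an adversary who knows the code but not the seed
       cannot construct an input that is bad on every run.
       (rand() % n is slightly biased, but fine for a sketch.) */
    swap(&v[lo + rand() % (hi - lo + 1)], &v[hi]);

    int i = lo;                       /* Lomuto partition around v[hi] */
    for (int j = lo; j < hi; j++)
        if (v[j] < v[hi])
            swap(&v[i++], &v[j]);
    swap(&v[i], &v[hi]);

    rquicksort(v, lo, i - 1);
    rquicksort(v, i + 1, hi);
}

A deterministic "pivot on the first element" version goes quadratic on
already-sorted input every single time; with the random pivot, the same
input is only slow with vanishing probability.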
[I'm not persuaded. Quicksort's a bad example -- it's not hard to modify
quicksort to avoid the worst case behavior while still being completely
deterministic, and I'd be amazed if there were any compiler algorithms that
weren't tractable. -John]
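
One deterministic way to do what the moderator suggests (my sketch, not
anything he posted): cap the recursion depth and fall back to heapsort,
so the sort involves no randomness yet can never go quadratic. This is
the idea behind what Musser later published as introsort.

static void swap(int *a, int *b) { int t = *a; *a = *b; *b = t; }

/* Sift v[root] down to restore the max-heap property in v[0..end]. */
static void sift_down(int v[], int root, int end)
{
    while (2 * root + 1 <= end) {
        int child = 2 * root + 1;
        if (child < end && v[child] < v[child + 1])
            child++;
        if (v[root] >= v[child])
            return;
        swap(&v[root], &v[child]);
        root = child;
    }
}

/* Guaranteed O(n log n); used only as the fallback. */
static void heapsort_ints(int v[], int n)
{
    for (int start = n / 2 - 1; start >= 0; start--)
        sift_down(v, start, n - 1);
    for (int end = n - 1; end > 0; end--) {
        swap(&v[0], &v[end]);
        sift_down(v, 0, end - 1);
    }
}

/* Quicksort with a deterministic middle pivot; once the depth budget
   is exhausted, hand the subarray to heapsort instead of recursing. */
static void dquicksort(int v[], int lo, int hi, int depth)
{
    if (lo >= hi)
        return;
    if (depth == 0) {                     /* worst case looming: bail out */
        heapsort_ints(v + lo, hi - lo + 1);
        return;
    }
    swap(&v[lo + (hi - lo) / 2], &v[hi]); /* deterministic middle pivot */
    int i = lo;
    for (int j = lo; j < hi; j++)
        if (v[j] < v[hi])
            swap(&v[i++], &v[j]);
    swap(&v[i], &v[hi]);
    dquicksort(v, lo, i - 1, depth - 1);
    dquicksort(v, i + 1, hi, depth - 1);
}

Callers start with a depth budget of about 2 * log2(n), which ordinary
quicksort never exceeds, so the fallback fires only on pathological
inputs.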



