Re: specifications (was Re: Languages: The Bigger the Uglier)



From: bobduff@world.std.com (Robert A Duff)
Newsgroups: comp.compilers
Date: 24 Mar 1996 21:51:18 -0500
Organization: The World Public Access UNIX, Brookline, MA
References: 96-02-226 96-03-123 96-03-151 96-03-164
Keywords: standards

I wrote:


>> This could only work if the language has no implementation-defined
>> features, and no non-determinism...


Jeff Jackson <jgj@ssd.hcsc.com> wrote:


>I disagree.


I disagree with your disagreement.


>... The implementation doesn't have to generate wonderful
>code. We can build into it all the run-time bounds checking and
>analysis we want to verify assumptions and warn of the use of
>implementation-defined features.


OK, I don't mind if it's 10 or even 100 times slower than a "real"
implementation. But we're talking about non-determinism, and that can
blow up combinatorially.


For example, consider a language like Ada with multi-tasking. Almost
any interleaving of tasks is legal. The "standard" implementation
would have to execute all possible interleavings, which is so
inefficient that it would typically take longer than the current age
of the universe. It's not just a matter of adding a few checks here
and there.
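
Just to put numbers on it: the number of interleavings of k tasks,
each doing n atomic steps, is the multinomial coefficient
(k*n)!/(n!)^k.  A small, purely illustrative C sketch (the task and
step counts below are arbitrary) shows how fast that grows:

/* Count the interleavings of ntasks tasks, each performing nsteps
 * atomic steps: (ntasks*nsteps)! / (nsteps!)^ntasks.
 * double is used so the big counts merely lose precision rather
 * than overflow. */
#include <stdio.h>

static double factorial(int n)
{
    double f = 1.0;
    int i;
    for (i = 2; i <= n; i++)
        f *= i;
    return f;
}

int main(void)
{
    int nsteps = 10;
    int ntasks;
    for (ntasks = 2; ntasks <= 5; ntasks++) {
        double count = factorial(ntasks * nsteps);
        int t;
        for (t = 0; t < ntasks; t++)
            count /= factorial(nsteps);
        printf("%d tasks x %d steps: about %.3g interleavings\n",
               ntasks, nsteps, count);
    }
    return 0;
}

Already at 5 tasks of 10 steps each the count is around 10^31, and
real tasks execute far more than 10 steps.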


>... Thus if the standard requires that
>reference parameters not alias each other or any other variable
>visible, it could generate code to verify that this is in fact the
>case.


True. But what if the language allows either pass-by-reference or
pass-by-copy, at the whim of the compiler writer, with neither one
considered "wrong"? Ada is an example -- aliasing is allowed, but when
parameters are aliased you might get different results depending on
which mechanism the compiler chose.
The standard implementation would have to try both, and both for any
nested calls, and so on, leading to combinatorial explosion. To see
if a given implementation is correct, you would have to compare its
results to ALL POSSIBLE results from the "standard" implementation.
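
To make the aliasing point concrete, here is a purely illustrative C
sketch (the names and the use of a global are invented for the
example) that models the two mechanisms by hand; the same call gives
different, yet equally "correct", results:

/* A C model of the situation: the actual parameter aliases the global
 * g, so the result depends on which mechanism the "compiler" (here,
 * the programmer) picked.  Both outcomes count as correct. */
#include <stdio.h>

static int g;

/* By-reference: an assignment to the formal is immediately visible
 * through g. */
static void bump_by_reference(int *formal)
{
    *formal = 1;        /* g becomes 1 */
    g = g + 10;         /* g becomes 11 */
}

/* Copy-in/copy-out: the formal is a local copy, written back only at
 * the end of the call. */
static void bump_by_copy(int *actual)
{
    int formal = *actual;   /* copy in: formal = 0 */
    formal = 1;
    g = g + 10;             /* does not see the assignment: g becomes 10 */
    *actual = formal;       /* copy out: g becomes 1 */
}

int main(void)
{
    g = 0;
    bump_by_reference(&g);
    printf("by reference:     g = %d\n", g);    /* prints 11 */

    g = 0;
    bump_by_copy(&g);
    printf("copy-in/copy-out: g = %d\n", g);    /* prints 1 */
    return 0;
}

The standard implementation would have to run something like both of
these for every call where a parameter might be aliased.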


>... If the order of some evaluation is undefined, it could go out
>of its way to make sure the order is random each time (thus exposing
>hidden assumptions in the test program).


Not good enough. The original assertion was that you could try a
program on the "standard" implementation, and compare the results with
those of some "real" implementation to see whether the real
implementation is correct. But the real implementation might choose a
different (but still correct) order of evaluation, which would not
match the standard one. No, the standard implementation has to try
ALL POSSIBLE orders, and report all possible correct results.
Combinatorial explosion.
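
A toy version of the problem, sketched in C (where operand evaluation
order for "+" is likewise unspecified; f and g are invented for the
example): the "standard" implementation would have to report every
result reachable under some legal order, and a real implementation is
correct only if its answer is a member of that set.

/* f() + g() where both functions update the same variable.  A
 * conforming C compiler may evaluate either operand first, so both
 * printed values are legitimate answers for x = 1. */
#include <stdio.h>

static int x;

static int f(void) { x = x * 2; return x; }
static int g(void) { x = x + 3; return x; }

int main(void)
{
    int a, b;

    /* Order 1: f before g */
    x = 1;
    a = f();            /* x = 2, a = 2 */
    b = g();            /* x = 5, b = 5 */
    printf("f first: %d\n", a + b);     /* 7 */

    /* Order 2: g before f */
    x = 1;
    b = g();            /* x = 4, b = 4 */
    a = f();            /* x = 8, a = 8 */
    printf("g first: %d\n", a + b);     /* 12 */

    /* A real implementation's value for f() + g() is correct as long
     * as it matches one member of this set of possible results. */
    return 0;
}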


>... If the value of an
>expression is undefined in the presence of overflow, it could generate
>run-time code to detect overflows and generate random numbers as a
>result, or better yet, tag any variable as being undefined and
>generate a report at the end about behaviours that can't be relied
>upon.


True, but again, that doesn't tell you whether a given implementation
that gives *different* results from the standard implementation is
nevertheless a correct implementation of the language.
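
For what it's worth, a minimal C sketch of that style of checked
arithmetic (the tagged_int type and the choice of 0 as a placeholder
value are inventions for the example, not anything from an actual
implementation) might look like this:

/* Add two ints, but detect signed overflow before it happens and tag
 * the result as undefined instead of trusting whatever the hardware
 * would do. */
#include <stdio.h>
#include <limits.h>

struct tagged_int {
    int value;
    int undefined;      /* nonzero once overflow has poisoned the value */
};

static struct tagged_int checked_add(struct tagged_int a,
                                     struct tagged_int b)
{
    struct tagged_int r;
    r.undefined = a.undefined || b.undefined;
    if (!r.undefined &&
        ((b.value > 0 && a.value > INT_MAX - b.value) ||
         (b.value < 0 && a.value < INT_MIN - b.value)))
        r.undefined = 1;
    r.value = r.undefined ? 0 : a.value + b.value;  /* 0 is arbitrary */
    return r;
}

int main(void)
{
    struct tagged_int a = { INT_MAX, 0 };
    struct tagged_int b = { 1, 0 };
    struct tagged_int c = checked_add(a, b);

    if (c.undefined)
        printf("result depends on overflow -- not to be relied upon\n");
    else
        printf("result = %d\n", c.value);
    return 0;
}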


The goal is to unambiguously specify language semantics. As long as
languages allow non-determinism, the notion of a "standard
implementation" doesn't solve that problem. Now, we can argue about
whether non-determinism is a good idea...


Our moderator says:
>[I'd be pretty pessimistic about the possibility of getting such a reference
>implementation even as well debugged as your typical standards document. -John]


Well, yeah, that too. This more mundane issue is the more relevant
one in practice.


- Bob

