From: jens.hansson@mailbox.swipnet.se (Jens Hansson)
Newsgroups: comp.compilers
Date: 8 Mar 1996 00:15:38 -0500
Organization: -
References: 96-01-037 96-02-187 96-02-226
Keywords: C, standards, design
>[Formal defs may help, but having attempted to make sense of the PL/I
>standard, I can't see them as a solution to bloat. The Cobol crowd at
>least divides their language into sublanguages that you can understand
>and add to your implementation one at a time. -John]
My suggestion would be:
Describe the semantics in a formal language that is machine-readable
and executable. I believe this is the only way to prove that a
language is "formal" enough. On top of that, this should save the
compiler writers a lot of work -- you could just use the
specification. If you want to do it better (i.e. improving
performance) you have an executable to generate test-vectors from.
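
To make this concrete, here is a minimal sketch of what an
executable specification could look like: a definitional interpreter
for a toy expression language. The grammar and all the names in it
are my own invention, purely for illustration -- a real spec would
cover a full language. In C:

/* A toy "executable specification": a definitional interpreter
 * for integer expressions built from numbers, +, * and parentheses.
 *
 *   expr   = term   { "+" term }
 *   term   = factor { "*" factor }
 *   factor = number | "(" expr ")"
 */
#include <ctype.h>
#include <stdio.h>
#include <stdlib.h>

static const char *p;              /* cursor into the source text */

static long expr(void);            /* forward: factor calls expr */

static void skip(void) { while (isspace((unsigned char)*p)) p++; }

static long factor(void)
{
    skip();
    if (*p == '(') {
        long v;
        p++;                       /* consume "(" */
        v = expr();
        skip();
        if (*p != ')') { fprintf(stderr, "expected )\n"); exit(1); }
        p++;                       /* consume ")" */
        return v;
    }
    if (isdigit((unsigned char)*p))
        return strtol(p, (char **)&p, 10);
    fprintf(stderr, "parse error at \"%s\"\n", p);
    exit(1);
}

static long term(void)
{
    long v = factor();
    for (skip(); *p == '*'; skip()) { p++; v *= factor(); }
    return v;
}

static long expr(void)
{
    long v = term();
    for (skip(); *p == '+'; skip()) { p++; v += term(); }
    return v;
}

int main(void)
{
    const char *src = "2 + 3 * (4 + 1)";
    p = src;
    printf("%s = %ld\n", src, expr());   /* prints: ... = 17 */
    return 0;
}

A compiler writer could feed the same programs to this interpreter
and to the compiled output and diff the results -- the test-vector
idea in miniature.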
The problem with this approach is that the language itself might
*allow* differences in implementation, the way C allows different
integer sizes on different machines. Freedom in byte ordering (more
serious than the C int problem) may also create a problem if the
formal language is too strict. I suppose all modern architectures use
two's complement for negative integers, but old computers may use
sign/magnitude representation.
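
To see how much room C actually leaves its implementations, a small
probe program (my own illustration, not anything from the thread)
can print the properties mentioned above -- its output legitimately
differs between conforming compilers:

/* Probe some implementation-defined properties of C: the size of
 * int, the bits in a char, and the machine's byte order. */
#include <limits.h>
#include <stdio.h>

int main(void)
{
    unsigned int x = 1;
    unsigned char *bytes = (unsigned char *)&x;

    printf("sizeof(int) = %u bytes\n", (unsigned)sizeof(int));
    printf("CHAR_BIT    = %d bits\n", CHAR_BIT);
    printf("byte order  = %s-endian\n",
           bytes[0] == 1 ? "little" : "big");
    /* What >> does to a negative int is also implementation-
     * defined: some compilers shift in sign bits, others zeros. */
    printf("-1 >> 1     = %d\n", -1 >> 1);
    return 0;
}

A strict executable spec would have to take all of these as
parameters rather than bake in one set of answers.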
As there *are* fully working compilers (or at least nearly bug-free
ones), there shouldn't be any technical obstacle to writing an
executable formal spec. Right?