Re: Are these all really true ? (Richard A. O'Keefe)
Tue, 3 Oct 1995 05:22:03 GMT

Newsgroups: comp.compilers
From: (Richard A. O'Keefe)
Keywords: design, specification
Organization: Comp Sci, RMIT, Melbourne, Australia
References: 95-09-076 95-09-108 95-09-134
Date: Tue, 3 Oct 1995 05:22:03 GMT

>* Formal specifications yield correct programs.

A number of posters have attacked this claim, and I've received E-mail
arguing against it. It seems to me that the people who have this negative
view of formal specification are working from assumptions that differ
radically from mine.

For me, a good formal specification is an
  - intellectually manageable
  - "very high level" description in a language with defined semantics
  - that abstracts away from implementation
  - and not only *can* be subjected to a wide range of static checks
      but *has* been so subjected
  - which may even be executable, so as to provide a means of debugging
      one's test library.
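To make the last point concrete, here is a minimal sketch (mine, not O'Keefe's) of what an "executable specification" used to debug a test library might look like: the specification of sorting is written as a predicate, and each (input, expected-output) pair in the test suite is checked against it.

```python
# Hypothetical sketch: an executable specification of sorting, used not as
# the implementation but as an oracle for validating a test library.

def is_sorted(xs):
    """Spec clause: every element is <= its successor."""
    return all(a <= b for a, b in zip(xs, xs[1:]))

def is_permutation(xs, ys):
    """Spec clause: the output is a rearrangement of the input."""
    return sorted(xs) == sorted(ys)

def meets_sort_spec(inp, out):
    """The full specification: sorted AND a permutation of the input."""
    return is_sorted(out) and is_permutation(inp, out)

# Debugging the test library itself: run every test case through the spec.
test_cases = [
    ([3, 1, 2], [1, 2, 3]),
    ([], []),
    ([2, 2, 1], [1, 2]),   # buggy test case: the expected output drops an element
]

bad_cases = [(i, e) for i, e in test_cases if not meets_sort_spec(i, e)]
```

Running the spec over the suite flags the third test case as internally inconsistent before any implementation is ever tested against it.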

I do *not* expect a formal specification to cover *every* aspect of a
program in all cases. For example, I think the work done by the Ada 95
"Language Precision Team" was an appropriate use of formal techniques:
they isolated difficult issues, specified them formally, and proved or
disproved that certain techniques would or would not work. They were
using the methods of formal specification to solve design issues. It
didn't _matter_ if they "fell out of synch with the code"; what mattered
was that a large number of plausible but flawed designs had been pruned
away early.

A particularly important point about formal specifications is that they
can be type checked, scope checked, cross-referenced, and so on. That
is not enough to prove that what you are saying is meaningful or correct,
but again, the point is that it _can_ prove that what you are saying is
_not_ meaningful or correct. The payoff is in the bad specifications
that get caught early.

It's not that "formal specifications yield correct programs",
it's that "machine checkable specifications prune out many bad designs early".
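As an illustration of that slogan (my sketch, not from the original post), a machine check can show that a specification is unsatisfiable before any code exists. The deliberately flawed spec below demands an output that is both sorted and in the same order as the input; exhaustive search over small inputs finds the contradiction immediately.

```python
from itertools import permutations

# Hypothetical sketch: mechanically checking a specification for consistency.
# The (deliberately flawed) spec requires an output that is sorted AND
# preserves the input's order -- requirements that conflict whenever the
# input is not already sorted.

def satisfies_spec(inp, out):
    sorted_ok = all(a <= b for a, b in zip(out, out[1:]))
    same_order = list(out) == list(inp)          # "preserves input order"
    permutation_ok = sorted(out) == sorted(inp)
    return sorted_ok and same_order and permutation_ok

def spec_satisfiable(inp):
    """Does ANY candidate output meet the spec for this input?"""
    return any(satisfies_spec(inp, list(out)) for out in permutations(inp))

# The checker produces a counterexample at once: no output can satisfy
# the spec for the input [2, 1], so the spec itself is bad.
counterexamples = [inp for inp in ([1, 2], [2, 1]) if not spec_satisfiable(inp)]
```

The payoff is exactly the one claimed above: the bad specification is caught early, by machine, with a concrete counterexample in hand.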

For my current favourite example of where formal specification _would_ be
useful, and the lack of it has led to confusion, consider UNIX regular
expressions. ed, sed, grep, egrep, awk, nawk, and so on all have regular
expressions, all slightly different (and then there are the two regular
expression libraries that came with this UNIX system, incompatible with
each other and everything else). What do they actually _mean_? I've never
seen a formal semantics for them. The informal semantics is obviously
wrong: "leftmost longest" is what they are _said_ to do, but try testing

Yet a formal specification is a great way to start deriving an implementation.

This is comp.compilers. Do I need to point out here that the C standard is
_not_ a formal specification, and that it has been found to contain precisely
the kinds of holes and inconsistencies that machine checking could have
caught? Can anyone look at the extremely informal specification of scope
rules in the draft C++ standard without their thumbs pricking?

Richard A. O'Keefe; RMIT Comp.Sci.
