Semantics Tools


Related articles
Re: constant folding at parse time twpierce@amhux1.amherst.EDU (Tim Pierce) (1992-08-19)
Re: constant folding at parse time drw@euclid.mit.edu (1992-08-24)
Semantics Tools eifrig@beanworld.cs.jhu.edu (1992-08-25)
Re: Semantics Tools Jonathan.Bowen@prg.oxford.ac.uk (1992-08-26)
Re: Semantics Tools db@dcs.ed.ac.uk (Dave Berry) (1992-08-26)
Re: Semantics Tools eifrig@beanworld.cs.jhu.edu (1992-08-28)
Correctness Proofs of Compilers fabio@dcs.edinburgh.ac.uk (1992-09-18)

Newsgroups: comp.compilers
From: eifrig@beanworld.cs.jhu.edu (Jonathan Eifrig)
Organization: The Johns Hopkins University CS Department
Date: Tue, 25 Aug 1992 23:37:29 GMT
References: 92-08-114 92-08-147
Keywords: parse, optimize

Our Moderator asks:
>[Are there specification languages that would allow you to define your
>language precisely enough to tell you mechanically whether a proposed
>optimization was valid? -John]


In general, of course not: any such tool could, among other
miraculous feats, decide the halting problem.  I think the best one
could hope for is a mechanical proof checker/proof development tool,
sort of like a Nuprl for code transformations.


Curiously, I was wondering something related: How many people are
using formal semantic methods to either (1) generate a compiler or (2)
prove a compiler correct in "real life"? My gut feeling was "not many,"
but then I started thinking about some of the transformational and
continuations-based compilation approaches that have been published in the
last few years, and started wondering.


In principle, it is possible to write an operational semantics for
a language in, say, natural-deduction style, and mechanically translate
this into a CPS-style interpreter for the language, using the proof
rules as the primitive operations of the interpreter, provided the
semantics of the language has certain reasonable properties (basically,
the proof rules have to be "deterministic": no two proof rules should
have unifying conclusions).  Partially evaluating this interpreter then
yields _some_ sort of compiler, albeit an inefficient one.  I don't
know of any example of such a compiler, however.  More likely, the
proof rules are used to verify that the initial translation of the
source language into some standard intermediate language is faithful,
and then the intermediate language is compiled.  (See kelsey89a.)
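
To make the "proof rules as primitive operations" idea concrete, here
is a minimal sketch of my own (not taken from any of the papers cited
below) for a toy expression language; the type and constructor names
are invented for illustration.  Each big-step evaluation rule becomes
one arm of a CPS-style interpreter, and the rules are "deterministic"
in the sense above because no two of them have conclusions of the
same shape.

(* A toy expression language; the constructors are illustrative only. *)
type exp =
  | Num of int
  | Add of exp * exp
  | If0 of exp * exp * exp            (* if e1 = 0 then e2 else e3 *)

(* eval e k: establish "e evaluates to n" and pass n to the
   continuation k.  Each match arm plays the role of one proof rule. *)
let rec eval (e : exp) (k : int -> 'a) : 'a =
  match e with
  | Num n -> k n                      (* axiom: n => n *)
  | Add (e1, e2) ->                   (* e1 => n1, e2 => n2  gives  e1+e2 => n1+n2 *)
      eval e1 (fun n1 -> eval e2 (fun n2 -> k (n1 + n2)))
  | If0 (e1, e2, e3) ->               (* two rules, selected by the value of the test *)
      eval e1 (fun n -> if n = 0 then eval e2 k else eval e3 k)

let _ = eval (Add (Num 1, Num 2)) print_int    (* prints 3 *)

Partially evaluating eval with respect to a fixed program unrolls the
continuations away, which is roughly the sense in which "some sort of
compiler" falls out.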


As long as we're on the subject of semantics tools, I have to give
a plug to the folks from INRIA and their Centaur system.  It's a
package of integrated tools, allowing one to (1) define an abstract
syntax for a language and its concrete representation, automatically
constructing a lexer and parser for the language, (2) define various
"pretty-printers" for the abstract syntax, yielding printers and
syntax-directed editors of various sorts, (3) give a natural-deduction
style operational semantics for the language, similar to the
Harper-Honsell-Plotkin LF system (see harper87), automatically
constructing an interpreter for the language, and (4) create an
integrated environment, under X, for experimenting with your new
language.  It's a great tool for playing around with little languages
with novel features.  It's not free, but not real expensive, either;
only a few hundred bucks.  Contact "centaur-request@mirsa.inria.fr"
for more info.
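
For flavor, here is roughly the kind of information steps (1) and (2)
ask for.  Centaur has its own metalanguages for these, so the
following is not Centaur input; it is just the same idea rendered as
an OCaml sketch, with an invented toy while-language.

(* Step (1): abstract syntax for a toy while-language (illustrative only). *)
type exp =
  | Var of string
  | Num of int
  | Lt  of exp * exp

type stmt =
  | Skip
  | Assign of string * exp
  | Seq    of stmt * stmt
  | While  of exp * stmt

(* Step (2): one possible "pretty-printer" over that syntax; from
   declarations of roughly this shape, a system like Centaur derives
   unparsers and syntax-directed editors. *)
let rec pp_exp = function
  | Var x     -> x
  | Num n     -> string_of_int n
  | Lt (a, b) -> pp_exp a ^ " < " ^ pp_exp b

let rec pp_stmt = function
  | Skip          -> "skip"
  | Assign (x, e) -> x ^ " := " ^ pp_exp e
  | Seq (s1, s2)  -> pp_stmt s1 ^ "; " ^ pp_stmt s2
  | While (e, s)  -> "while " ^ pp_exp e ^ " do " ^ pp_stmt s ^ " od"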


Ref's:
------


@inproceedings{harper87,
author = "Robert Harper and Furio Honsell and Gordon Plotkin",
title = "A Framework for Defining Logics",
booktitle = lics2,
year = 1987,
pages = "194--204",
abstract = "The Logical Framework (LF) is a sytem for defining a wide
class of logics. It is based on a general treatment of syntax, rules, and
proofs in terms of a typed lambda-calculus with dependent types. Syntax
is treated in a style similar to, but more general than, Martin-Lof's
system of arities. The treatment of rules and proofs focuses on the
notion of a {\em judgement}. Logics are encoded in the LF via a new
principle, the {\em judgements as types} principle, whereby each judgement
is identified with the type of its proofs. This allows for a smooth
treatment of discharge and variable occurrence conditions and leads to a
uniform treatment of rules and proofs whereby rules are viewed as proofs
of higher-order judgements and proof checking is reduced to type checking.
An important benefit of our treatment of formal systems is that
logic-independent tools such as proof editors and proof-checkers can be
constructed."
}


@inproceedings{kelsey89a,
author = "Richard Kelsey and Paul Hudak",
title = " Realistic Compilation by Program Transformation",
booktitle = popl16,
year = 1989,
pages = "281--292",
abstract = "Using concepts from denotational semantics, we have produced a
very simple compiler that can be used to compile standard programming
languages and produces object code as efficient as that of production
compilers. The compiler is based entirely on source-to-source
transformations performed on programs that have been translated into an
intermediate language resembling the lambda calculus. The output of the
compiler, while still in the intermediate language, can be trivially
translated into machine code for the target machine. The compilation by
transformation strategy is simple: the goal is to remove any dependencies
on the intermediate language semantics that the target machine cannot
implement directly. Front-ends have been written for Pascal, BASIC, and
Scheme and the compiler produces code for the MC68020 microprocessor."
}
--
Jack Eifrig (eifrig@cs.jhu.edu) The Johns Hopkins University, C.S. Dept.
--

