Newsgroups: comp.compilers
From: Paul Tarvydas <tarvydas@turing.toronto.edu>
Keywords: tools
Organization: Compilers Central
References: 92-02-086
Date: Thu, 20 Feb 92 14:49:36 EST
My company has been solving clients' problems by designing little
languages since 1985. Most of these have been implemented in S/SL and
data descriptors. For quick language prototypes, though, I doubt you'd be
happy with either of these, and you'd be much less happy with Yacc and
Lex.
Unlike our moderator John, I'm more positive about the prospects for
automated language-construction tools.
Your best bet would be to explore TXL, the Turing eXtender Language, which
has already been extended beyond Turing. An announcement of its
ftp-ability was posted to this newsgroup sometime last year (contact me if
you can't find the details). TXL is a tool which allows you to mutate
existing languages by providing extensions to the syntax and semantics of
some base language. The result compiles down into an existing language
(e.g. Turing, C), so all of the schlocky details of compilation are punted
to an existing compiler. TXL has been used to graft concepts, such as OOP,
onto procedural languages.
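
To make the idea concrete (my own caricature, not TXL's actual
notation): the extension is written as a rewrite from the new syntax
into the base syntax, and everything downstream belongs to the base
compiler. A minimal sketch in Haskell, for a hypothetical toy language
where a "for" loop is the grafted-on construct:

    -- A hypothetical base language: assignment, while-loops, sequencing,
    -- plus one grafted-on extension: for i := lo to hi do body.
    data Stmt
      = Assign String Expr
      | While Expr [Stmt]
      | Seq [Stmt]
      | For String Expr Expr [Stmt]   -- the extension
      deriving Show

    data Expr = Var String | Lit Int | Add Expr Expr | Le Expr Expr
      deriving Show

    -- "Compile down" the extension: rewrite every For into base
    -- constructs, leaving the rest of the tree untouched.
    desugar :: Stmt -> Stmt
    desugar (For i lo hi body) =
      Seq [ Assign i lo
          , While (Le (Var i) hi)
                  (map desugar body ++ [Assign i (Add (Var i) (Lit 1))])
          ]
    desugar (While c body) = While c (map desugar body)
    desugar (Seq ss)       = Seq (map desugar ss)
    desugar s              = s

After desugaring, only base constructs remain, so an existing compiler
can take over; that is the whole trick.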
The other place to look might be "denotational semantics" (dn). The
hurl-index of this stuff is exceptionally high, but if you can get your
hands on an actual implementation, you might be able to use it to
prototype language constructs. If you happen to think that
"call-with-continuation" is the *only* construct worthy of consideration
in Scheme, then dn is for you.
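
If you strip away the Greek, the core idea is small: the meaning of
each construct is a function, and the beloved continuations show up as
an explicit "rest of the computation" argument threaded through every
equation. A minimal sketch in Haskell of a continuation-style meaning
function for a toy expression language (all names are mine, not any
published dn notation):

    type Env  = [(String, Int)]
    type Cont = Int -> Int      -- the "rest of the computation"

    data Exp = Num Int | Var String | Plus Exp Exp

    -- Each construct denotes a function from an environment and a
    -- continuation to a final answer. Composing these denotations *is*
    -- the semantics; there is no interpreter loop.
    meaning :: Exp -> Env -> Cont -> Int
    meaning (Num n)    _   k = k n
    meaning (Var x)    env k =
      k (maybe (error ("unbound " ++ x)) id (lookup x env))
    meaning (Plus a b) env k =
      meaning a env (\va -> meaning b env (\vb -> k (va + vb)))

    -- Running a program means supplying the identity continuation:
    -- run (Plus (Var "x") (Num 2)) == 42
    run :: Exp -> Int
    run e = meaning e [("x", 40)] id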
The most noteworthy, practical, albeit unavailable, implementation of dn
is described in [1]. Peter Lee developed a method of partitioning dn in
such a way as to make it actually runnable on a PC. He does this by
implicitly rediscovering a subset of compiler technology which has been
around for decades. If you split a description of a language into three
pieces (scanner/parser, semantics, coder) and give the pieces funny names,
then you can automatically generate a compiler which can run in finite
space and time. Someday, the dn guys, if they keep their noses to the
grindstone, will rediscover the allocation and various optimization
pieces, too.
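
In conventional terms, the partitioning amounts to a staged pipeline.
A deliberately silly Haskell caricature, with placeholder types
standing in for the real phases:

    -- The three-way split, with stand-in types. "Scanner/parser" and
    -- "coder" are the decades-old front and back ends; the dn
    -- description only has to supply the middle stage.
    type Source = String
    type Ast    = [String]   -- stand-in for a real syntax tree
    type Ir     = [String]   -- stand-in for analyzed, annotated code
    type Target = String

    parse   :: Source -> Ast    -- scanner/parser
    analyze :: Ast -> Ir        -- semantics
    emit    :: Ir -> Target     -- coder

    parse   = words
    analyze = map (++ ":checked")
    emit    = unlines

    compile :: Source -> Target
    compile = emit . analyze . parse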
The good part about TXL and dn is that they're declarative. You can
forget about details (like data structures, control flow, and memory
management) and concentrate solely on expressing the meat of the
problem. I've prototyped languages in Lisp, Eiffel, Smalltalk and Prolog
because they also give you this freedom from detail.
[In the long run, functional semantics is a nice way to build tools for
the construction of complex software. Analogous to what electrical
engineers do when they analyze complex circuits by suppressing sources,
functional languages will give us software guys the ability to chop up
systems into understandable "views" and then recombine them without
worrying about the side effects of doing so.]
[A good rule of thumb: when designing a little language, design it to have
declarative or functional or assign-once semantics. This allows you to
build in things, like ref-counting garbage collection, transparently.
General functional languages aren't a well-solved problem (at least not
from the practically-minded procedural guys' point of view), but it's
quite easy to do a good job on functional semantics in a restricted
problem domain (i.e. the domain of a little language). The fact that the
little language's semantics are functional makes it much easier to design
and build good application-specific optimizations (ones that actually
work) into the little-language compiler.]
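
To make that last point concrete (my illustration, not part of the
note above): in a pure, assign-once little language, two textually
identical subexpressions always denote the same value, so an
optimization like common-subexpression elimination needs no
side-effect analysis at all. A sketch in Haskell:

    import qualified Data.Map as M

    -- A pure (assign-once) expression language.
    data E = Lit Int | V String | Add E E | Mul E E
      deriving (Eq, Ord, Show)

    -- Every subexpression of e, including e itself.
    subs :: E -> [E]
    subs e@(Add a b) = e : subs a ++ subs b
    subs e@(Mul a b) = e : subs a ++ subs b
    subs e           = [e]

    -- Common-subexpression candidates: compound subtrees occurring
    -- more than once. Purity guarantees the second occurrence computes
    -- the same value as the first, so no alias or store analysis is
    -- needed before hoisting them into a shared temporary.
    cseCandidates :: E -> [E]
    cseCandidates e =
      [ s | (s, n) <- M.toList counts, n > (1 :: Int), isCompound s ]
      where
        counts = M.fromListWith (+) [ (s, 1) | s <- subs e ]
        isCompound (Add _ _) = True
        isCompound (Mul _ _) = True
        isCompound _         = False

For example, cseCandidates on (a + b) * (a + b) reports the shared
(a + b); in a language with assignment, the same transformation would
first have to prove that nothing writes to a or b in between.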
> [If only life were so simple. How do you plan to specify a language in
> terms concrete enough that a computer can handle it? ... -John]
Have a look at [2] and [3] to see how easily this can be done using
declarative decision trees (you can even read the result without learning
ancient Greek or lambastic calculus).
[1] "Realistic Compiler Generation", Peter Lee, MIT Press 1989
[2] "Automatic Generation of Modular Semantic Analyzers from
Functional Specifications", Goran T. Janevski, Queen's University,
Kingston, Ontario, Canada, April 1990
[3] "Code Generation Using an Orthogonal Model" J.R. Cordy and R.C.
Holt, SP&E March 1990
Paul Tarvydas
TS Controls
tel. (416)-234-0889
fax. (416)-234-9193
uucp: tarvydas@turing.toronto.edu
[By fortuitous coincidence, a message announcing a new version of TXL was
posted today. -John]