Re: Threaded Interpretive Languages

pop <>
Tue, 28 Sep 1993 14:50:28 GMT

          From comp.compilers


Newsgroups: comp.compilers
From: pop <>
Keywords: code, bibliography, history, comment
Organization: Compilers Central
References: 93-09-080
Date: Tue, 28 Sep 1993 14:50:28 GMT

Julian Noble's article prompts me to respond.

> The prototypical TIL is FORTH. There is a newsgroup comp.lang.forth on
> which you can read the often convoluted expressions of dedicated FORTH
> programmers. A nice article on the history of FORTH appeared in BYTE
> Magazine September 1980 (written by its creator, Dr. Charles Moore).

It depends on what you mean by "prototypical". Clearly, it is the best
known. The 1980 article is not exactly precise about dates of the various
developments. The POP-1 language (Package for Online Programming) that we
developed in Edinburgh University in 1966-68 used TIL technology combined
with reverse polish notation. Syntax was quite like early Forth, using
FUNCTION and END delineation of function definitions. This is described in
the Machine Intelligence Series:

Popplestone, R.J. [1968] "POP-1: An Online Language", in Machine
Intelligence 2, eds. Dale, E. and Michie, D., Oliver and Boyd, Edinburgh,
Scotland.

Threaded -compiled- code arrived with POP-2 in 1968, which also used a
compiler providing an Algol-like syntax, developed from Rod Burstall's
REVPOL function written in POP-1, which -produced- the reverse-polish form
as output.

Burstall, R.M. and Popplestone, R.J. [1968] "The POP-2 Reference Manual",
Machine Intelligence 2, pp. 205-46, eds. Dale, E. and Michie, D., Oliver
and Boyd, Edinburgh, Scotland.

> In many ways FORTH is like Lisp (or perhaps Lisp is like FORTH? :-),
> except it uses postfix ("reverse Polish") rather than prefix notation.

Well... certainly LISP came first. An important dimension of difference
is garbage collection. If you garbage collect in a non-statically-typed
language you are rather forced into tagging (I don't regard conservative
collection as a serious option). Forth, if I understand it rightly, is not
tagged (an advantage for systems applications) and not garbage collected.
POP-1 and POP-2 were, and POP-11 is. This was seen as essential for the
symbolic programming needed for AI. However it made for implementations
which were a bit complicated for early micros. Bill Clocksin did a
version for the DEC-11 in 1968, called "Poppy", I seem to recall.

> In addition, the programmer has direct access to the main stack (parameter
> stack) of the cpu. Stack access + RPN greatly reduces the overhead of
> subroutine calls, hence encourage the fine-grained decomposition of
> programs into small functional units that do only one thing. For this
> reason FORTH programs tend to be vastly shorter and to use much less
> memory than equivalent programs in other languages. Typical automated
> measures of "program complexity" applied to FORTH code yield very low
> indices even for programs I would regard as complex.

Now it is clearly -false- that "Stack access + RPN greatly reduces the
overhead of subroutine calls". RPN is quite irrelevant - any syntax can be
mapped into reverse polish form, and in elementary compilers often is.
Requiring stack access constrains a language implementation in a way that
maps quite well onto many complex instruction sets, but poorly onto RISC.
What -is- true is that complex (but still inadequate) rules for handling
free variables, as in Pascal, do make it hard to have low-overhead
function calls. But the C language is less constrained than more direct
descendants of Algol 60. Any technique used for implementing function call
in FORTH is equally applicable to C, but not conversely. In fact, the
-open stack- if it contains only arguments imposes an overhead in that a
separate pointer to it is required, using up a precious register.

Where there -is- a gain in writing these languages over others, it is
typically that much less work is needed for I/O, generating test data etc.
The availability of a compilation mechanism at run-time also saves a lot
of work.

Moreover with a language that supports in-lining [like C++ (not my
favourite by the way) or Haskell say] a big win is possible, since a small
function may inline in less space than its call takes, and is of course
faster. This is directly opposed to threaded-code, where by definition the
small function -must- be called, since it may be redefined. Typically
threaded code is useful for -debugging- but it is best converted into an
unthreaded form for use in a module like a mailer where it will be
normally used unchanged.
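A minimal sketch of the point about threaded code, in Python rather than FORTH (the encoding here is my own, for illustration): every word is invoked through a table, so redefinition is cheap and retroactive, but each use necessarily remains a call and so cannot be inlined.

```python
# Indirect-threaded dispatch, sketched in Python: a word is a list of
# word names and literals; calls go through a mutable table, so a
# redefinition changes every later use of code that refers to it.

words = {}

def run(thread, stack):
    """Execute a thread: a list of word names and literal values."""
    for op in thread:
        if isinstance(op, str):
            words[op](stack)       # call through the table
        else:
            stack.append(op)       # literal: push it

words['+'] = lambda s: s.append(s.pop() + s.pop())
words['*'] = lambda s: s.append(s.pop() * s.pop())
# "*+" is itself a thread that calls through the table:
words['*+'] = lambda s: run(['*', '+'], s)

stack = []
run([3, 4, 5, '*+'], stack)
print(stack)                       # [23], i.e. 4*5 + 3

# Redefining '*' now would retroactively change '*+' -- which is
# precisely why the small function -must- be called, not inlined.
```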

There are -advantages- to an open stack, but they are not those claimed.

(a) By providing a simple method of passing arguments it is easier to
define certain higher-order functions. Thus -newarray- in POP-2/POP-11
takes a list of bounds and a function of n-arguments, and returns a
-memoised- version of the function. It is easier to implement various
kinds of memoization in a general way with an open stack.

(b) The open stack provides what is in effect cheap storage, and supports
cheap multiple-value return in a way that allows tuples so returned to be
associatively combined. For example, the POP-11:

define flat(T);
    define flat1(T);
        if is_node(T) then flat1(left(T)); flat1(right(T));
        else T
        endif
    enddefine;
    [% flat1(T) %]
enddefine;

works by -stacking- the leaves of the tree in the inner -flat1- function.
Rod Burstall's [% .... %] construction converts the stacked values to a
list on the heap. This function is O(n), whereas an implementation using
-append- would be O(n^2). And any O(n) implementation in a non-open-stack
language is conceptually complicated by having to pass an argument in
which the result can be accumulated.
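The accumulator-passing version can be seen in a Python sketch (my own encoding: internal nodes as pairs); the explicit `acc` list plays the role the open stack plays in the POP-11 version, giving O(n) against O(n^2) for the append-based one.

```python
# O(n) tree flattening with an accumulator, versus O(n^2) with
# repeated list concatenation -- the contrast the POP-11 [% ... %]
# construction avoids by stacking the leaves directly.

def is_node(t):
    return isinstance(t, tuple)   # internal nodes are (left, right)

def flat(t):
    acc = []                      # stands in for the open stack
    def flat1(t):
        if is_node(t):
            flat1(t[0]); flat1(t[1])
        else:
            acc.append(t)         # "stack" the leaf
    flat1(t)
    return acc

def flat_append(t):               # the O(n^2) version, for contrast
    if is_node(t):
        return flat_append(t[0]) + flat_append(t[1])
    return [t]

tree = ((1, 2), (3, (4, 5)))
print(flat(tree))                 # [1, 2, 3, 4, 5]
```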

> FORTH is incrementally compiled. The compilation mechanism is part of the
> language and its components accessible to the programmer. This latter
> feature distinguishes FORTH from all other languages I know. (Maybe Lisp
> does it too? Don't know enough about it to say.) As a result, the user can
> try something out as soon as it is defined. For example (a trivial one, to
> be sure!) suppose we define a new FORTH subroutine (everything in FORTH is
> a subroutine, called a "word", just as everything in C is a function):

> : *+ * + ; ( a b c -- b*c+a)

POP-11 is rather more verbose. The direct equivalent:

        define *+ (/*a,b,c*/); * + /* b*c + a */ enddefine;

But you can write:

        define *+ (a,b,c); b*c+a enddefine;

> The colon ***is*** the compiler. It is a FORTH word that makes a new
> dictionary entry with the name *+ .

There are some disadvantages to this kind of approach. In fact Rod
Burstall urged that we should use an abstract syntax for POP-2, rather
than having the whole compiler driven by "immediate action" tokens. The
primary disadvantage is that any manipulation of a program is tricky. For
example, you might want to run some code through an algebraic simplifier,
perhaps with some parameters bound. This is in principle easy in LISP
(though in practice made difficult by the baroque way the language has
developed). Typically it means that LISP macros are more sophisticated
than POP-11 ones.

Significant program manipulation is hard in both POP-11 and FORTH - the
applicative form of a reverse polish expression is only determined -when- you know
the arity of the symbols. With threaded code, this is potentially
rebindable at any time. Thus the *+ example above might as well be logsin,
with a different applicative structure.
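The arity-dependence can be made concrete with a small Python sketch (the parser and names here are hypothetical, mine not the post's): the same token sequence yields different applicative structures under different arity tables.

```python
# Recovering the applicative (tree) structure of reverse polish
# requires an arity table; change the arity of a symbol and the same
# token sequence parses to a different tree.

def to_tree(rpn, arity):
    """Parse an RPN token list into nested tuples, given arities."""
    stack = []
    for tok in rpn:
        n = arity.get(tok, 0)          # literals have arity 0
        if n:
            args = tuple(stack[-n:])   # pop the operator's arguments
            del stack[-n:]
            stack.append((tok,) + args)
        else:
            stack.append(tok)
    return stack                       # whatever remains stacked

rpn = [3, 4, 5, '*+']
print(to_tree(rpn, {'*+': 3}))   # [('*+', 3, 4, 5)]
print(to_tree(rpn, {'*+': 2}))   # [3, ('*+', 4, 5)]
```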


> 3 4 5 *+ . 23 ok

> (The dot or period is a word that says "emit the number on the top of the
> stack, consuming it in the process". FORTH words usually consume their
> arguments.) We see the word *+ did precisely what it was supposed to,
> hence it is debugged and can be marked "tested" in the source code.


The POP-11 equivalent:

        3,4,5, . *+ =>

** 23

Here the "." says "do it" as opposed to pushing a pointer to the code on
the stack for supplying the function as an argument to another function.
The "=>" means "print the entire contents of the stack".

More conventionally

        *+(3,4,5) =>

** 23

> So one of the major advantages of a TIL like FORTH is the ease of
> developing and debugging code. None of the cumbersome accessories of the
> more mainstream languages -- programming editors, interactive debuggers,
> MAKE -- are needed in FORTH. This means FORTH can be written and debugged
> quickly on small machines.

Now here we come to the main issue. POP-1 -was- a small language. POP-2
was larger, and POP-11 has grown to be big by some people's standards,
though smaller than Common Lisp systems. Now various languages have made
much bigger market penetration because of being -at- the right time -in-
the right place and -with- the right facilities. Some facilities at some
times represent overkill, but they may be seen as essential later. The
primary facility that comes to mind is garbage collection, which -is
essential-. Other facilities that take a lot of space in Poplog are the
Common Lisp equivalent numbers (fractions, complex numbers ...) and the
C-based widget set.

However both FORTH and C were conceived of as languages for -small-
machines and their role in the market place may be eroded as people
realise that machines are no longer small, but software packages are
-big-, and need to be written in languages that provide better means of
constraining programmers.

Systems like POPLOG which provide -semantic- support intended for a
variety of languages would seem to have a distinct advantage over both
FORTH and C.

> Another major advantage of FORTH is access to the compilation mechanism.
> The components of : (the standard compiler) can be used to create special-
> ized mini compilers for specific tasks. For example, I have been using for
> several years a simple mini compiler FSM: that turns a state-transition
> table into a finite state machine. Recently I read an article (before
> submission to a journal) on mini compilers that produce optimized machine
> code for a target cpu, then download it for testing (this kind of thing
> enormously speeds embedded systems development and real-time programming).
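The FSM mini-compiler idea can be sketched in Python (the table encoding below is my assumption, not Noble's FORTH `FSM:`): "compile" a state-transition table into a runnable recogniser.

```python
# A table-driven finite state machine "compiler": given a transition
# table, produce a closure that runs the machine over an input.

def compile_fsm(table, start, accepting):
    """table: {(state, symbol): next_state}. Returns a recogniser."""
    def run(inputs):
        state = start
        for sym in inputs:
            state = table.get((state, sym))
            if state is None:
                return False          # no transition: reject
        return state in accepting
    return run

# Recognise binary strings with an even number of 1s:
even_ones = compile_fsm(
    {('E', '0'): 'E', ('E', '1'): 'O',
     ('O', '0'): 'O', ('O', '1'): 'E'},
    start='E', accepting={'E'})

print(even_ones('1001'))    # True  (two 1s)
print(even_ones('1011'))    # False (three 1s)
```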

Poplog provides something of an -embarras de richesses- in this respect. As
standard you get:

(a) A set of procedure calls to generate code for the POPLOG Virtual Machine.
(b) A procedure call to compile a -lazy- list of tokens of the POP-11 language
into VM code.
(c) A procedure for compiling a -stream- of characters in the POP-11 language.
(d) A procedure for compiling a -stream- of characters in the SML language.
(e) A procedure for compiling a -stream- of characters in the Common
        Lisp language.
(f) A procedure for compiling a -stream- of characters in the Prolog language.

The next release is coming out with a YACC-style parser generator.

> The above and related advantages are the chief reasons why FORTH users
> would rather fight than switch.

One knows the feeling...

> What are the disadvantages? Systems folk dislike FORTH on multi user
> systems, since it is virtually impossible to protect against a clever
> FORTH programmer screwing up the system. Thus FORTH has tended --with
> notable exceptions-- to remain on single-user environments.

Again, a -difference-. POP-2 was defined specifically to be unable to
break the system. This is because it was used as the sole programming
language in a time-sharing system which provided no hardware protection...
This rigour has been relaxed for POP-11, though beginners are not usually
told how to break the Poplog system. However FORTH can hardly be worse
than C - it should be able to break -just- a process and not a whole Unix
system. (But on reflection I would not be too sure - I have seen some
pretty flakey Unices..)

SML on the other hand is -very- type-safe.

> High level FORTH develops much more quickly, but generally runs up to 10x
> slower than equivalent C (albeit the code is also 10x shorter).

Well... I managed to get the -tak- benchmark to run 6x -faster- than C in
POP-11 for a particular set of big arguments by writing:

        memo_n(tak,3,1) -> tak;

That is "replace tak by its memoised version - by the way it takes 3
arguments and returns one result". Now this is perhaps cheating. But is
it? Since it changes the complexity class of the function, I can beat C by
any factor I like! This has some real utility - for example memoising can
typically provide an easy way to tune programs that chew repeatedly over
the same data. Algebraic simplifiers are one example of such.
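The `memo_n` trick has a direct analogue in Python's `functools.lru_cache` (a sketch, not Poplog's mechanism); `tak` is the standard Takeuchi benchmark, and rebinding the name routes the recursive calls through the cache.

```python
# Memoising tak changes its complexity class -- which is why the
# "benchmark" can then beat compiled C by an arbitrary factor.

from functools import lru_cache

def tak(x, y, z):
    if y < x:
        return tak(tak(x - 1, y, z),
                   tak(y - 1, z, x),
                   tak(z - 1, x, y))
    return z

# "replace tak by its memoised version" -- the recursive calls go
# through the global name, so they hit the cache too:
tak = lru_cache(maxsize=None)(tak)

print(tak(18, 12, 6))   # 7
```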

Actually, the raw speed for systemy kinds of things seems to be about half
the speed of C if strictly comparable code is used.

> [But, as Winston Churchill said to a lady who complained he was drunk
> "Yes, Madam, and you are ugly. But in the morning I shall be sober." That
> is, one can speed up the FORTH very easily without notably increasing its
> code image, by using the built-in assembler. All FORTHs worthy of the name
> include an assembler that will allow definitions of words directly in
> machine language. The resulting words look and feel to the user exactly
> like words defined with : ... ; so there is no linkage step or separate
> assembly.

The only snag here is that assembler is non-portable. The C-interface of
POPLOG is to my mind preferable.

Robin Popplestone.
[It is my impression that FORTH gained popularity more for historical
reasons than technical ones, that it became popular in astronomical
circles because it ran on cheap little micros and application software
existed to control telescopes. MUMPS became popular for medical records
systems for similar reasons. -John]
