From: Francois-Rene Rideau <rideau@ens.fr>
Newsgroups: comp.compilers,comp.lang.misc,comp.arch.arithmetic
Followup-To: comp.compilers,comp.lang.misc
Date: 14 Mar 1997 00:02:47 -0500
Organization: None
References: 97-03-037
Keywords: design
hrubin@stat.purdue.edu (Herman Rubin) writes:
> This is an example where there can be useful hardware which cannot be
> incorporated into a language, and/or the compiler could not do a good
> job with it. Both of these need to be changed; what is needed is that
> other operations can be added at will to the language, and that the
> compiler can be instructed to select from optimization techniques
> which the user can supply it, as well as from those which the compiler
> writer included. These will be machine dependent, and may even depend
> on what other instructions are running in the neighborhood.
I had the same idea back in '91, when investigating ways to
implement matrices in a programming language that would be both
formally accurate and computationally efficient: you should be able to
write the same beautiful, clean formulas as you would on a piece of
paper, so that you can prove a few mathematical theorems about them;
yet the compiler should handle everything efficiently, automatically
taking into account symmetries in the problem, optimizing away
computations with constant null and identity matrices, and so on.
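As an illustration of that last point (a minimal sketch with a hypothetical fixed-size Mat2 type, not Tunes code): the meta-knowledge "this operand is the identity" can be encoded in a type, so that overload resolution eliminates the multiplication at compile time instead of doing the arithmetic at run time.

```cpp
#include <array>

// Hypothetical 2x2 matrix type, for illustration only; a real system
// would be generic over dimensions and element types.
struct Mat2 {
    std::array<double, 4> a;  // row-major entries
};

// Tag type carrying the meta-knowledge "this operand is the identity".
struct Identity2 {};

// General product: does the full arithmetic.
Mat2 operator*(const Mat2& x, const Mat2& y) {
    return Mat2{{x.a[0]*y.a[0] + x.a[1]*y.a[2],
                 x.a[0]*y.a[1] + x.a[1]*y.a[3],
                 x.a[2]*y.a[0] + x.a[3]*y.a[2],
                 x.a[2]*y.a[1] + x.a[3]*y.a[3]}};
}

// Overload selected at compile time: multiplying by the identity is a
// no-op, so the computation is optimized away entirely.
Mat2 operator*(const Mat2& x, Identity2) { return x; }
```

This only works when the programmer writes the special case by hand for each operation; the complaint in the text is precisely that the language gives no way to state such algebraic facts once and have the compiler apply them everywhere.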
Of course, I was very angry at the C++ language (the only "OO"
stuff I knew at the time), because it promised encapsulation but
obviously wasn't able to encapsulate *any* of my meta-knowledge (not
to mention its lack of genericity, templates being both unavailable at
the time, and stupid even now).
However, I knew that, as a lone pupil in math class, I couldn't
rewrite a full compiler: I was sure the tenfold speedup gained by
optimization in good cases would be completely lost in the overhead of
an interpreter, which was the only thing I could afford to write.
Still, that was the heart of what is now the Tunes project (see
below in my .sig), with which I'm periodically harassing people on
Usenet (sorry, bored readers).
Since then, I've discovered languages with some optimization hints:
all kinds of #pragmas (Ada), function attributes (GNU C), access
to manual low-level optimization of atomic blocks (GNU C style inline
asm), or (declare)ations (Common Lisp). But these never let you
automate anything; if you want, say, an optimized FFT routine for a
given CPU, you still have a *lot* of manual work to do (at the very
least, you must produce a flattened version of the routine on which
the automatic optimizers can work). And don't expect the above tools
to automatically maintain consistency between your low-level
optimized code and the high-level version (if, for instance, you
modify the dimension of the arrays/vectors on which to compute the
FFT).
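To make the limitation concrete, here is the kind of hint those mechanisms provide, shown as a GNU-style function attribute (a GNU C/C++ extension; the function name is made up for the example). The attribute lets the compiler merge or hoist duplicate calls, but it cannot derive a specialized routine, and nothing checks that the claim it makes stays true as the code evolves.

```cpp
// GNU-style function attribute: a hint that this function has no side
// effects and its result depends only on its arguments, so the compiler
// may merge repeated calls with the same arguments. It is an unchecked
// promise by the programmer, not knowledge the system can verify or reuse.
__attribute__((const))
int square(int x) { return x * x; }
```

An Ada pragma or a Common Lisp (declare (optimize ...)) plays a similar role: each tunes code the compiler already sees, rather than letting the user teach the compiler a new optimization technique.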
Other people have talked about such things as "OI" (open
implementation), or MOPs (meta-object protocols), but always with a
very limited purpose, or in hackish ways that could not be reused
outside of their petty environments, often in ways that deny any
possibility of program analysis.
So, if you're interested in the development of a truly reflective
system that allows the specification and development of objects such
that you can
* modify the representation of objects (their operational semantics)
without modifying their abstract meaning (declarative semantics)
* more generally, provide meta-knowledge to the system
* have ways to statically prove/check the validity of meta-transformations
* more generally, safely meta-program.
then do join the Tunes project.
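The first bullet above can be sketched in conventional terms (hypothetical types, not Tunes code): two representations of the same abstract object, here a set of small integers, with identical declarative semantics. A reflective system would let you swap one for the other, and prove the swap valid, without touching client code.

```cpp
#include <vector>
#include <bitset>

// Abstract meaning (declarative semantics): a set of small non-negative
// integers with add() and a membership test has().

// Representation 1: an unsorted vector -- compact, cheap to build.
struct VecSet {
    std::vector<int> xs;
    void add(int x) { xs.push_back(x); }
    bool has(int x) const {
        for (int y : xs) if (y == x) return true;
        return false;
    }
};

// Representation 2: a bitset -- constant-time membership over a bounded
// universe. Different operational semantics, same abstract meaning.
struct BitSet {
    std::bitset<1024> bits;
    void add(int x) { bits.set(x); }
    bool has(int x) const { return bits.test(x); }
};
```

In C++ the equivalence of the two representations lives only in the programmer's head; the point of the bullets above is to make it an explicit, checkable piece of meta-knowledge.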
> [As always, I invite Herman to sketch out such a language so we can see
> what the concrete syntax would look like. I used IMP72, a language where
> you could add any operations you wanted, and it was awful. -John]
Well, I think we have to separate a purely *semantical* problem
here from possible *syntactical* complications: surely, some new
syntax could be added for new operators, and many languages have done
so with more or less success; but such a thing is *completely*
independent of being able to declare optimization techniques to the
compiler.
== Fare' -- rideau@ens.fr -- Franc,ois-Rene' Rideau -- DDa(.ng-Vu~ Ba^n ==