The melting ice technology (1): compilers & interpreters — firstname.lastname@example.org (1994-05-09)
The melting ice technology (2): levels — email@example.com (1994-05-09)
Re: The melting ice technology (2): levels — firstname.lastname@example.org (1994-05-12)
Re: The melting ice technology (2): levels — email@example.com (1994-05-13)
Re: The melting ice technology (2): levels — firstname.lastname@example.org (1994-05-13)
Re: The melting ice technology (2): levels — email@example.com (1994-05-14)
Re: The melting ice technology (2): levels — firstname.lastname@example.org (1994-05-16)
From: email@example.com (Michael Coffin)
Organization: University of Waterloo
Date: Fri, 13 May 1994 15:00:31 GMT
[re Bertrand Meyer's articles on compilation and interpretation]
As near as I can tell, by your definition, all language translation
systems are both compilers and interpreters, which makes the distinction
meaningless.

The definitions I'm more familiar with are that a _compiler_ generates
machine code, which is executed by a CPU, while a _translator_ produces
data which is then "interpreted" (hence the name "interpreter") by another
program. By these definitions it sounds like the system in question is an
interpreter that incorporates a large compiled library. I.e., a very
nice, well-engineered interpreter, reminiscent of some Lisp systems.
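The distinction drawn above can be made concrete with a toy sketch (the
language and names here are hypothetical, invented purely for
illustration): a "translator" turns source text into bytecode, which is
just data, and a separate loop then "interprets" that data. No machine
code is ever produced, which is what makes the whole thing an
interpreter in this terminology.

```python
# Hypothetical toy language: postfix arithmetic, e.g. "3 4 + 2 *".

def translate(source):
    """Translator: turn source text into bytecode (a list of tuples)."""
    code = []
    for token in source.split():
        if token.isdigit():
            code.append(("PUSH", int(token)))
        elif token in ("+", "*"):
            code.append(("OP", token))
        else:
            raise ValueError("unknown token: " + token)
    return code

def interpret(code):
    """Interpreter: execute the bytecode on a simple stack machine."""
    stack = []
    for op, arg in code:
        if op == "PUSH":
            stack.append(arg)
        else:  # "OP"
            b, a = stack.pop(), stack.pop()
            stack.append(a + b if arg == "+" else a * b)
    return stack.pop()

print(interpret(translate("3 4 + 2 *")))  # 14
```

The bytecode in the middle is the "data" referred to above: it can be
saved, shipped, or handed to a different interpreter, but it is not
executable by the CPU directly.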
I don't think there's any reason to be defensive about this, by the way.
Many very useful languages---Scheme, Lisp, Elisp, Perl, Icon, and Awk, for
example---are usually implemented using the same basic approach: an
interpreter with compiled versions of the time-critical pieces to avoid
most of the performance penalty. Although some of them allow compilation
of code as well as interpretation (Icon and the Hobbit compiler for Scheme,
for example), none of them I'm familiar with seem to integrate
interpretation and compilation as nicely as Melting Ice.
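A minimal sketch of that hybrid approach, under the assumption (mine,
not the original poster's) that the interpreted program is just a list
of operation names: the dispatch loop stays interpreted, while the
time-critical operations are delegated to precompiled routines. Here
CPython's C-implemented builtins stand in for the compiled library.

```python
# "Compiled library": routines implemented in C inside CPython.
COMPILED = {
    "sort": sorted,
    "sum":  sum,
}

def run(program, data):
    """Interpret a list of operation names, applying each to the data."""
    for op in program:
        if op in COMPILED:
            data = COMPILED[op](data)   # hand off to the compiled routine
        elif op == "dedupe":            # handled purely in the interpreter
            seen, out = set(), []
            for x in data:
                if x not in seen:
                    seen.add(x)
                    out.append(x)
            data = out
        else:
            raise ValueError("unknown op: " + op)
    return data

print(run(["dedupe", "sort"], [3, 1, 3, 2]))  # [1, 2, 3]
```

Most of the running time is spent inside the compiled routines, so the
per-operation interpretation overhead matters much less than the naive
"interpreters are slow" intuition suggests.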