What is an interpreter? email@example.com (Paul Robinson) (1993-05-08)
Re: What is an interpreter? firstname.lastname@example.org (1993-05-09)
Re: What is an interpreter? email@example.com (1993-05-09)
Re: What is an interpreter? firstname.lastname@example.org (1993-05-10)
Re: What is an interpreter? email@example.com (1993-05-10)
Re: What is an interpreter? firstname.lastname@example.org (1993-05-11)
Re: What is an interpreter? email@example.com (1993-05-13)
From: firstname.lastname@example.org (Stavros Macrakis)
Organization: OSF Research Institute
Date: Tue, 11 May 1993 23:53:41 GMT
Paul Robinson <email@example.com> writes:

> How do we determine when something is a "real" interpreter of a
> "real" language, and when it doesn't quite reach that point?
Most of the postings on the subject concentrate on the issue of how
"powerful" a notation has to be to qualify as a "language". Although I'd
agree with many of the definitions if what we were after were a definition
of a "general-purpose programming language", there are many other kinds of
useful and interesting languages around.
To my mind, almost any non-trivial notation is some kind of language, not
in the formal sense (regular expression language, context-free language,
etc.), but in a very pragmatic sense. For instance, the
spreadsheet-formula language has some interesting properties, even when
formulas are restricted to arithmetic expressions. The YACC language (the
part that isn't C) has some interesting properties.
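As an illustration (a minimal sketch, not from the original post), even the arithmetic-only subset of a spreadsheet-formula language is a genuine language, with a grammar and an evaluator of its own. The cell-reference syntax and function names below are illustrative assumptions:

```python
# Minimal sketch: an evaluator for the arithmetic subset of a
# spreadsheet-formula language, with cell references.
# The grammar and naming here are illustrative assumptions.
import re

def evaluate(formula, cells):
    """Evaluate e.g. "=A1 + 2 * B2" against a dict of cell values."""
    tokens = re.findall(r"[A-Z]+\d+|\d+\.?\d*|[-+*/()]", formula.lstrip("="))
    pos = 0

    def peek():
        return tokens[pos] if pos < len(tokens) else None

    def next_tok():
        nonlocal pos
        tok = tokens[pos]
        pos += 1
        return tok

    def atom():
        tok = next_tok()
        if tok == "(":
            val = expr()
            next_tok()                 # consume ")"
            return val
        if re.fullmatch(r"[A-Z]+\d+", tok):
            return cells[tok]          # cell reference
        return float(tok)              # numeric literal

    def term():                        # handles * and /
        val = atom()
        while peek() in ("*", "/"):
            op, rhs = next_tok(), atom()
            val = val * rhs if op == "*" else val / rhs
        return val

    def expr():                        # handles + and -
        val = term()
        while peek() in ("+", "-"):
            op, rhs = next_tok(), term()
            val = val + rhs if op == "+" else val - rhs
        return val

    return expr()
```

Even this restricted subset has the "interesting properties" mentioned above: operator precedence, recursive nesting, and references into an external environment (the cells).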
I'd agree that something like the command-line language for gcc is pretty
trivial, but how about the makefile language?
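The makefile language is a good example of a non-trivial little language: it is a declarative notation for a dependency graph plus rebuild recipes. Its core semantics can be sketched as follows (a hypothetical illustration, not make's actual implementation; rule format and names are assumptions):

```python
# Sketch of the core semantics behind the makefile language:
# each target is "rebuilt" after its prerequisites, at most once.
# The rule format and function names are illustrative assumptions.

def build(target, rules, built=None, log=None):
    """rules maps target -> (list of prerequisites, command string)."""
    if built is None:
        built = set()
    if log is None:
        log = []
    if target in built:
        return log                     # already up to date
    prereqs, command = rules.get(target, ([], None))
    for p in prereqs:
        build(p, rules, built, log)    # depth-first: prerequisites first
    if command is not None:
        log.append(command)            # "run" the recipe
    built.add(target)
    return log

rules = {
    "prog":   (["main.o", "util.o"], "link prog"),
    "main.o": (["main.c"], "compile main.c"),
    "util.o": (["util.c"], "compile util.c"),
}
```

The declarative part (the dependency graph) is what makes it a language worth designing carefully, quite apart from the shell commands embedded in the recipes.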
All of these "little" languages or "embedded" languages or "sub"languages
are worth taking seriously, both in their design, and in their use.
Otherwise, we're doomed to (continue to) have bad ones.