Re: Grammars for future languages
Tue, 24 Oct 1995 19:42:40 GMT

          From comp.compilers


Newsgroups: comp.compilers
Keywords: parse, design
Organization: In Mind, Inc.
References: 95-10-103
Date: Tue, 24 Oct 1995 19:42:40 GMT

(Michel Schinz) writes:
>Algol-like grammars are believed to be easier to understand and closer
>to the usual (mathematic) notations and english. On the other hand,
>they have problems: they are big (hard to learn and remember) and the
>operator/function-call distinction is a big problem. For example, in
>C++ you can overload existing operators but you cannot define new
>ones. In Eiffel, you can define new operators, but you cannot define
>their priority and associativity.

>My claim is that this may be true for people who already know a
>"classical" programming language (Pascal, Basic, etc.) but I do not
>think that this is true for complete beginners, who certainly do not
>understand any of them.

But should languages be designed for the "complete" beginner, which
then penalizes everybody (including the beginner after a few hours),
or for the professional programmer, who can be expected to spend
more time learning his tools? I have maintained elsewhere that syntax
which is TOO simple hurts, just as syntax which is TOO complex does.
But a large grammar is not necessarily too complex; it is the number
of special cases which hurts. If the language design is orthogonal, a
large grammar can be easier to learn than a smaller one.

> This intention comes to grief on the reality that syntax
> isn't what makes programming hard; it's the mental
> effort and organization required to specify an algorithm
> precisely that costs.

Exactly. And the better the notation you have available (which
generally means a more complex syntax), the easier it is to reason
about the algorithm and the easier it is to specify. It has been
said that you cannot "think" about something (at least not easily)
for which you don't have words. As I have said elsewhere, "notation
is everything". It dictates your ability to reason; a good notation
can lift the reasoning process to an entirely new level, and a bad
notation can severely depress it.

An example is Roman numerals (bad <g>) and Arabic numerals
(good <g>). It is only recently that manageable algorithms have
been developed for multiplication and division using Roman
numerals, precisely because the notation makes it hard to think
about the problem. Even addition is difficult enough.
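
To make the contrast concrete, here is a minimal sketch of my own (the
function name and the subtractive-pair rule are my illustration, not
anything from the original discussion): to do arithmetic on Roman
numerals at all, the practical route is to abandon the notation,
convert to a positional value, compute there, and convert back. The
notation itself gives the algorithm nothing to work with.

```cpp
#include <map>
#include <string>

// Convert a Roman numeral to a positional (Arabic) integer.
// A smaller symbol written before a larger one subtracts (IV = 4);
// otherwise symbols simply add.
int romanToInt(const std::string& s) {
    std::map<char, int> v{{'I',1},{'V',5},{'X',10},{'L',50},
                          {'C',100},{'D',500},{'M',1000}};
    int total = 0;
    for (std::string::size_type i = 0; i < s.size(); ++i) {
        if (i + 1 < s.size() && v[s[i]] < v[s[i + 1]])
            total -= v[s[i]];   // subtractive pair, e.g. IV, XC, CM
        else
            total += v[s[i]];
    }
    return total;
}
```

Once the value is positional, ordinary column-wise addition and long
multiplication apply; in Roman notation no comparably simple algorithm
exists.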

>For example, I remember clearly that when I learned my first
>"programming language" (Commodore 64 Basic :-), I had troubles
>understanding the concepts, not the syntax.

Sure, but was that true for the SECOND procedural language? Or,
for that matter, was it true the second day? (I also learned on
Basic, only for me it was RCA 70/46 timesharing Basic -- and
sure, it took about an hour to learn Basic and programming, but
by the second day I was entirely confident.) And even if it takes
some people longer to learn, they still only have to learn once.

>Also, even if being close to the mathematical notation was once very
>important, because the vast majority of programs used mathematics a
>lot, this isn't true anymore.

While most programs do not use mathematics directly, they still use
mathematics and logic indirectly. Consider the common task of
negating the logic value of an expression. Also, as long as arithmetic
and other operators are useful (as in C), you will find that a notation
derived from mathematics is still the best for thinking. Even removing
precedence (at least for common operators) can hurt severely.
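
As a small sketch of the point about negation (the function names here
are my own illustration, not from the original post): the operator
notation C borrowed from logic makes a De Morgan transformation easy
to see and to check, where spelled-out calls would bury it.

```cpp
// Negating a compound condition with the concise logical operators.
// The identity  !(lo <= x && x <= hi)  ==  (x < lo || hi < x)
// is visible at a glance in this notation.
bool inRange(int x, int lo, int hi) {
    return lo <= x && x <= hi;
}
bool outOfRange(int x, int lo, int hi) {
    return x < lo || hi < x;   // De Morgan applied to !inRange
}
```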

>Ok, there are still a lot of
>mathematical programs, but there is also a wide range of computer
>applications which simply do not need a special notation for
>arithmetic operations (compilers are an example).

I find that as I grow in ability and experience, I lean more and
more towards a formal approach to programming. That means
class invariants, preconditions, postconditions and other invariants.
And these ARE frequently best expressed in mathematical form. My
greatest bane at the moment is the fact that I don't have Unicode
as the medium for my programming language (along with thorough
editor and operating system support). The 8-bit character set is
just too restrictive.
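
A minimal sketch of what that formal style looks like in plain C++
(my own illustration, not the poster's code): assert() is about the
closest approximation to mathematical invariant notation that the
8-bit character set offers.

```cpp
#include <cassert>

// A bounded stack whose class invariant and pre/postconditions are
// written out explicitly as assertions.
class Stack {
    int data[16];
    int count = 0;
    // Class invariant: count always stays within the storage bounds.
    bool invariant() const { return 0 <= count && count <= 16; }
public:
    void push(int x) {
        assert(invariant() && count < 16);  // precondition: not full
        data[count++] = x;
        assert(invariant() && count > 0);   // postcondition: non-empty
    }
    int pop() {
        assert(invariant() && count > 0);   // precondition: non-empty
        int x = data[--count];
        assert(invariant());                // postcondition
        return x;
    }
    int size() const { return count; }
};
```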

>Simple and uniform grammars also have a great advantage when one wants
>to add new features to a language, like object-oriented capabilities.
>With simple grammars, user-defined constructs look just like
>predefined constructs.

But that is not the only way it can be accomplished. I am working
on a replacement for C++ which will compile C++ programs, but
which does NOT have any of the built-in operators or data elements.
I have found ways to allow the operators, their precedence,
and the "type" of the operator (prefix, infix, postfix, multifix) to
be specified by the programmer. I have found ways to tie literals
to user-defined classes. I have found ways to tie old-style data
type declarations to user-defined classes. And it isn't really that hard.

What is hard is defining new control structures in a seamless manner.
I know more than 20 control structures and more than 30 associated
modifiers. This is the basis for my reasoning when I program. Could
all of those be available in a language using only a simple syntax?
I believe that with an orthogonal syntax the answer is yes. But not
with a small syntax (if only because of the number of keywords).

>I therefore think that grammars for new languages should not be
>Algol-like but Lisp- or Smalltalk-like (or anything similar).
>Obviously, however, some people do not agree on this.

I believe that the reason that the procedural notation is so common
is that it is the most effective notation currently known. Sure,
there are sometimes problems extending the notation (C++ is a perfect
example). But that simply means that more research needs to be done
to integrate the new features in a consistent fashion -- which is
part of what I am doing with the C++ replacement project. However,
one of the constraints I have there is C++ compatibility. That DOES
limit the choices to some degree.

>I think that this issue is an important one, because if all new
>languages are designed to have a simple grammar, parsing could slowly
>become much easier, and its importance in compilation would decrease.

The effort required to parse the language by the compiler (and,
generally, the amount of work needed by the compiler writer) is not
really a relevant issue during language design. It only needs to be
shown that compilation IS possible. Syntax changes made to make the
compiler writer's job easier just hurt the user. For example, in C++
inline functions really should not be required to be defined before
their first usage. As long as they have been declared by a header,
the actual definition should be allowed to occur anywhere in the
translation unit. This is not very hard to do...just delay making
the inline substitutions until AFTER the entire translation unit has
been processed.
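
The ordering in question looks like this (a sketch of my own with
hypothetical function names): the code is legal C++, but a compiler
that insists on seeing the definition first will simply decline to
inline the call in g().

```cpp
// An inline function declared before use, with its definition
// deferred to later in the same translation unit.
inline int f();                 // declaration only

int g() { return f() + 1; }     // use precedes the definition

inline int f() { return 41; }   // definition later in the unit
```

A compiler that delays inline substitution until the whole translation
unit has been processed can inline this call just as well as one whose
definition came first.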

Michael Lee Finney
