Languages: The Bigger the Uglier (was: Re: Aliasing in ISO C)

rfg@monkeys.com (Ronald F. Guilmette)
19 Feb 1996 16:01:44 -0500

          From comp.compilers

Related articles
Possible to write compiler to Java VM? (I volunteer to summarize) seibel@sirius.com (Peter Seibel) (1996-01-17)
Re: Aliasing in ISO C rfg@monkeys.com (1996-02-16)
Re: Aliasing in ISO C jplevyak@violet-femmes.cs.uiuc.edu (1996-02-16)
Languages: The Bigger the Uglier (was: Re: Aliasing in ISO C) rfg@monkeys.com (1996-02-19)
Re: Languages: The Bigger the Uglier (was: Re: Aliasing in ISO C) jgm@CS.Cornell.EDU (1996-02-19)
Re: Languages: The Bigger the Uglier CSPT@giraffe.ru.ac.za (Pat Terry) (1996-02-20)
Re: Languages: The Bigger the Uglier (was: Re: Aliasing in ISO C) Martin.Jourdan@inria.fr (1996-02-21)
Re: Languages: The Bigger the Uglier (was: Re: Aliasing in ISO C) robison@kai.com (Arch Robison) (1996-02-21)
Re: Languages: The Bigger the Uglier (was: Re: Aliasing in ISO C) hbaker@netcom.com (1996-02-22)
Languages: The Bigger the Uglier (was: Re: Aliasing in ISO C) dave@occl-cam.demon.co.uk (Dave Lloyd) (1996-02-22)
[37 later articles]
From: rfg@monkeys.com (Ronald F. Guilmette)
Newsgroups: comp.compilers
Date: 19 Feb 1996 16:01:44 -0500
Organization: Infinite Monkeys & Co.
References: 96-01-037 96-02-171 96-02-187
Keywords: C, standards, design

In article 96-02-187,
John B. Plevyak <jplevyak@violet-femmes.cs.uiuc.edu> wrote:
>What are we as language implementors to do when even we cannot
>easily determine the meaning of a program because of "special
>exceptions" in the language definition?
>
>More than once I have heard it said that "I wouldn't dare" use this or
>that bit of small print for this or that optimization. The logic is
>clear: programs will fail mysteriously because the average programmer
>cannot be counted on to have read (and remembered) all the fine print.
>...
>I would contend that fine print in language definitions is just plain
>wrong, and that compilers which implement optimizations based on the
>fine print are asking for trouble...


The problem is not with compilers that take advantage of valid and
legal optimization opportunities.
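

For concreteness, here is a sketch (mine, not drawn from any of the
cited articles; the function is made up) of the kind of perfectly
legal opportunity I mean.  ISO C says, roughly, that an object's
stored value shall be accessed only through an lvalue of a compatible
type (or a character type), so a conforming compiler may assume that
a `long *' and a `double *' never designate the same object:

    long f(long *lp, double *dp)
    {
        *lp = 1;
        *dp = 2.0;    /* cannot legally modify *lp, since long and
                         double are incompatible types */
        return *lp;   /* so the compiler may simply return 1 here */
    }

A programmer who never read (or never remembered) that clause will be
mystified when an optimizing compiler does exactly that.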


The problem is also not one of fine print. (Most language standards
are set in very clear, very readable, good-sized print.)


The problem is that many/most of these (language standard) documents
run to well over 300 pages... some of them to over 1000... and ``good''
programmers should/must know every last passage and rule of such
language standards in order to be `safe' and to be sure that what they
wrote actually represents (to the compiler, and to the standard) what
they meant.


The problem is with the languages and their standards. They are just
too big. That's because most of the consumers of these languages are
just too greedy for more and more features, and because they are just
too ignorant of the inherent limits on the human brain's ability to
comprehend these complex behemoths we call `modern programming
languages'.


Hell! Even the people who sit on the drafting committees for these
recent languages don't know _all_ of the details of what they are
voting for most of the time.


I see only one solution. We must at some point introduce some
discipline into the process of crafting languages and their standards.
There is one good way to do that which would also result in the
languages themselves being not only smaller, but less buggy. We need
to insist that language standardization committees write complete
FORMAL SPECIFICATIONS for the arcane and baroque languages that they
are trying to standardize. If we (the consumers) insisted on this, we
would quickly see the wheat being separated from the chaff, and large
numbers of overly complex features being tossed overboard (like so
much ballast), much to the benefit of the average
programmer-on-the-street who has to code in these unfortunate and
massively over-engineered travesties of programming languages that we
have today.


P.S. It amuses me to see people still arguing about Ada versus C++.
They are both awful. Let's argue about the proper locations for the
deck chairs on the Titanic for a while, eh? The enemy is not Ada or
C++ or loose type checking or rigid type checking. The enemy is
complexity, and it is winning. Just look at the thickness of the
three successive revisions of the COBOL standard (i.e. 68, 74, and 85)
if you don't believe me.


There seems to be something about the adult human mind that yearns for
complexity in much the same way as the mind of a child yearns for
candy. We must eventually learn to discipline ourselves against this
or else we will ultimately drown in a sea of complexity of our own
making.


How thick will the COBOL standard of the year 2015 be? Five thousand
pages? Ten thousand? More?
--
-- Ron Guilmette, Roseville, CA -------- Infinite Monkeys & Co. ------------
---- E-mail: rfg@monkeys.com ----------- Purveyors of Compiler Test Suites -
[Formal defs may help, but having attempted to make sense of the PL/I
standard, I can't see them as a solution to bloat. The Cobol crowd at
least divides their language into sublanguages that you can understand
and add to your implementation one at a time. -John]



