Re: Programming language and IDE design

George Neuner <>
Tue, 22 Oct 2013 16:17:26 -0400

          From comp.compilers

Related articles
Programming language and IDE design (Martin Ward) (2013-10-17)
Re: Programming language and IDE design (BartC) (2013-10-19)
Re: Programming language and IDE design (Hans-Peter Diettrich) (2013-10-20)
Re: Programming language and IDE design (Gene Wirchenko) (2013-10-20)
Re: Programming language and IDE design (Gene Wirchenko) (2013-10-21)
Re: Programming language and IDE design (George Neuner) (2013-10-22)
Re: Programming language and IDE design (Hans-Peter Diettrich) (2013-10-23)
Re: Programming language and IDE design (2013-10-22)
Re: Programming language and IDE design (BartC) (2013-10-23)
Re: Programming language and IDE design (Stefan Monnier) (2013-10-24)
Re: Programming language and IDE design (George Neuner) (2013-10-24)
Re: Programming language and IDE design (Martin Ward) (2013-11-07)
[17 later articles]

From: George Neuner <>
Newsgroups: comp.compilers
Date: Tue, 22 Oct 2013 16:17:26 -0400
Organization: A noiseless patient Spider
References: 13-10-016
Keywords: tools, design
Posted-Date: 22 Oct 2013 16:35:45 EDT

On Thu, 17 Oct 2013 11:06:26 +0100, Martin Ward <> wrote:

>"Source code should be hard to write".

I disagree. Code should be easy to write ...

and easy to read ...

and, in almost all cases, easy to understand!

There are times when [well documented!!!] clever tricks are called
for, but in general, developers should be striving to write code that
is as boring and straightforward as possible.

I don't necessarily agree with the idea of limiting general purpose
languages, but I do think that sharp tools, by default, should be
reserved to "expert" modes.

Rather I agree with Hans-Peter that when suitable domain languages
exist, they ought to be used. My definition of "suitable", however,
presumes that the DSL is *practical* to use in real world situations.

Mathematica makes testing algorithms simple. Its apparent simplicity,
however, is a trap because Mathematica is, on average, 30x slower than
an equivalent C program. Even experienced users can too easily write
"programs" whose running time against a realistically large data set
would give the user time to take a foreign holiday.
Processing really _big_ data using Mathematica is just infeasible.

As a point of comparison, Lisp also makes testing algorithms pretty
simple. However, the combination of an experienced Lisp programmer and
a good Lisp compiler can closely rival a C program for speed.

>The justification for this statement is that one metric which correlates
>very strongly with maintenance effort is lines of code.
>If code is hard to write, then programmers will be motivated
>to spend fewer lines of code to implement a particular functionality.
>As Dijkstra wrote in "On the cruelty of really teaching computer science":
>"if we wish to count lines of code, we should not regard them as
>'lines produced' but as 'lines spent': the current conventional wisdom
>is so foolish as to book that count on the wrong side of the ledger."
>Similarly foolish wisdom is seen in IDEs (Integrated Development
>Environments) which allow the programmer to create hundreds of lines of code
>with just a few clicks of the mouse. Thus instantly creating a significant
>maintenance effort.

99% (or more) of generated code will never be touched by a human.
Probably 80% of it will never even be looked at.

I have never read anything which suggests to me that Dijkstra was - or
would have been - against source code generators. He had a well known
preference for typed, high level, structured languages rather than
assembly, but AIUI his main concern was with the control of stupid
programmer errors. He also was quite concerned with programmer
efficiency and coding effectiveness, noting in "The Humble Programmer"
that software systems were rapidly becoming too complex for a single
programmer to fully understand.

Given the second point, I believe he would have been in favor of
(carefully applied) source code generation. In particular, I think he
would have eagerly supported the idea of domain languages designed to
reduce the semantic gap between programmer and problem. I don't think
he would have been bothered much by the idea of DSL compilers
generating (good quality) programs in source form for an existing GPL
compiler. He would have preached discipline (another of his favorite
themes) in resolving problems within the domain language rather than
resorting to tweaking the generated GPL code.

>Another example of this "wisdom" (which in my
>experience is very common among commercial IBM assembler
>programmers) is when a programmer needs to implement a new function
>which is fairly similar to an existing function: he or she just grabs a copy
>of the existing module, changes a few lines of code and slaps it
>into production. Over time, the two copies drift further apart,
>bugs are fixed in one copy but remain lurking in the other copy,
>new programmers have two lots of code to read, which are
>confusingly similar, and so on.

Cut-n-paste happens with every programmer and in every language. I
have done it and I'll wager that you have done it and so too has
everyone you can find to ask.

The concern about unknown bugs being duplicated is valid, but consider:

1) generally you start by copying code that is already known to work.
It may contain latent bugs, but if the code was thoroughly tested for
its original purpose, bugs that remain should pertain only to
conditions that could not arise in that use.

2) once the code is modified for a new use, it is new code. Latent
bugs present in the original may no longer be there or (as in 1) may
pertain to conditions that will not arise in the new use.

3) you *ARE* going to thoroughly test the new code for its intended use.

Cut-n-paste doesn't worry me so much as the decline of design and
development skills. The average software developer today has little
or no CS or CE education, and often little experience in the domain of
the application. Combined with easy access to great volumes of open
source code that they understand poorly or not at all, I consider the
majority of these people to be highly dangerous. IMNSHO, a lot of
"professional" software developers would be doing the world a favor by
finding another profession.

>So my assertion is "Source code should be hard to write": *even if*
>the writing difficulty does not result in any improvement in ease of
>reading! For example: before any programmer checks in a new module or
>program change, they should be forced to write out all the code
>longhand with a quill pen. This apparently pointless extra work
>should give programmers the opportunity to think about whether there
>might be a shorter way to solve the problem (which might even be
>easier to read and understand).

Why stop at quills? Make them write the program in Sanskrit and carve
it into stone tablets ... *after* first fashioning their own copper
chisels and cutting stone for the tablets.

>This also affects programming language design:
>(1) Rather than use "{" and "}" for every type of grouping, each
>different type of group such as an if statement or while loop, should
>have its own opening and closing keywords. This is something that
>modern COBOL gets right (along with the many things it gets wrong!)

C's mistake was not in using (or over using or abusing) curly braces.
Rather, C's mistake was in defining the scope of its control
constructs to be a single statement rather than a demarcated list of
statements. Had C control constructs always expected a list of
statements - thus *requiring* the scoping braces - many stupid
programmer errors might have been avoided. [Lisp proponents would say
the error was defining the language in terms of "statements" vs
"expressions" ... but I don't go that far.]

I don't object to closing keywords [in fact I prefer Pascal-like
syntax, though Modula3 and Oberon are better examples than Pascal
itself] but I don't agree that they necessarily make errors easier to
spot for human readers. Closing keywords are nothing more than
syntactic punctuation, and punctuation in human language normally is
absorbed by readers at a subconscious level.

I despise stupid keyword reversal closings such as IF..FI - loop can
go jump in the pool and tceles sounds to me like a high speed train
(and probably is a homophone for a swear word in some language).

Things like <keyword>..END<keyword> get unwieldy when keyword phrases
may contain optional attributes, such as in generalized loop
constructs where the demarcation keywords may have to include labels
and/or conditions.

My preference is simply for a common END keyword used by all
constructs. I think keyword case consistency is important, but I think
the choice to use all upper case or all lower case should be left to
the programmer.
What actually *is* important is not the choice of keywords, scope
markers or other punctuation, but rather that the scope of control
constructs is defined to be a list of statements. Given this,
indentation by a structural editor will immediately show structural
mistakes as misaligned statements.
[For an example, look at Lisp code using Emacs "lisp mode". Try
adding or removing a parenthesis somewhere and see what happens.]

>(2) A commonly accepted rule for designing databases and software
>systems is that each piece of information should be stored in only one place.

Singleton data doesn't prevent name aliasing, in databases or in
programs. It is the aliasing which is a major source of programming errors.

Moreover, there is data for which multiple instances are
indistinguishable in use, e.g., the integer value 42. Non-shared uses
of such data can use copies interchangeably.

>With a programming language, it is better for the reader if
>information is duplicated to all the places where it might be needed,
>with the compiler enforcing consistency.

Yes and no.

Redundancy is good to a point beyond which it becomes noise.

I prefer to define things in one place and to have the compiler
propagate the information to wherever it is needed. Excepting
interfaces to "foreign" objects which otherwise would be unknown to
the compiler, I consider the redundant declarations required by many
languages to be error-prone unnecessary crap.

How exactly should a compiler "enforce consistency" of corresponding
declarations in separate compilation units? How is the compiler even
to know that they should be corresponding?

>(3) Indentation helps the reader if it is consistent: so should also
>be enforced by the compiler. In line with point (2), nesting should
>be defined by *both* indentation *and* grouping keywords:
>with the compiler enforcing consistency between the representations.
>This means that the reader can rely on either indication of nesting,
>since both are reliable.

Under NO circumstances should a compiler be enforcing coding styles.

Significant indentation is the dumbest idea since paper tape.

Ergonomics dictates that the screen dimensions, fonts and indentation
units that I choose for my viewing comfort should have absolutely no
effect on your choices.

Indentation does aid human readers and should be applied consistently
... by means of a structural editor. Given a properly designed
language in which control constructs expect a demarcated list of
statements, then matching scopes will have matching indentation and
structural mistakes in the code will be apparent as misaligned statements.

Python got many things right, but the whole idea of significant
indentation in a modern language is a non-starter. I understand why
Guido thought it might be a good idea, but he really should have left
it alone.

