Related articles:
  Re: Non-declared Variables - gnorik@gmail.com (2006-10-24)
  RE: Non-declared Variables - qtj-query@shaw.ca (Quinn Tyler Jackson) (2006-10-26)
  Re: Non-declared Variables - pjb@informatimago.com (Pascal Bourguignon) (2006-10-28)
From: Quinn Tyler Jackson <qtj-query@shaw.ca>
Newsgroups: comp.compilers
Date: 26 Oct 2006 00:28:45 -0400
Organization: Compilers Central
References: 06-10-098
Keywords: design
Posted-Date: 26 Oct 2006 00:28:45 EDT
> I don't think that it is a good idea when language definition allows
> you to use variables without declaration.
I think languages should forbid the declaration of variables, for the
following reasons:
Required declaration of variables...
* ... leads to the programmer jumping to conclusions about what a variable
is for.
* ... is unnecessary: code doesn't need no declaration telling a compiler
what it can and cannot do.
* ... puts the responsibility for correctness on human beings, and human
beings are fallible.
* ... produces type mismatch errors during compilation, which inhibit
creativity.
* ... because it's redundant to introduce syntactic sugar into a language
when there is already a mechanism in place to state intended use of a
variable: comments.
* ... puts undue weight on variables, bringing them to others' attention and
de-emphasizing the algorithm.
* ... de-emphasizes the Wikipedian notion of "eventualism" ("the program
will eventually be correct if it's important enough to make it so, and a
mass of coders will make it correct one day if it's notable") and emphasizes
"immediatism" ("the program must work correctly NOW or it is useless.")
* ... increases inter-line dependencies: variables declared on line n first
used on line n+j, for some increasing value of j over time as code gets
tweaked.
* ... requires compiler technology to remember too much information that it
could otherwise guess at with probabilistic methods.
* ... gives a false sense of correctness: all code is wrong somewhere, so
why dress it up and present it as being "more correct" -- correctness is an
absolute, not a scale.
* ... is disempowering: who's to say a variable's right to self-determination
should be usurped by some declaration? Let it decide its own eventual fate.
* ... is too prematurely legalistic: when a coder declares what a variable
"is" -- he or she is presuming to know what the meaning of "is" is.
* ... puts too much responsibility on other code for being correct, causing
resentment between lines.
* ... is stifling to system evolution: if someone comes along later and
wants to change a variable's type, he or she must then do an impact study to
see what might break. Since evolving systems are broken anyway, this deters
progress by dictating that the breaks land in the more important part, the
system logic, rather than the less important part, the administrivia.
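For the record, here is a minimal sketch (in Python, with hypothetical names)
of the failure mode behind the original complaint: in a language where
assignment silently creates a variable, a typo becomes a brand-new variable
instead of an error.

    def total_price(items):
        total = 0
        for item in items:
            # Typo: "totl" quietly creates a fresh variable; "total" is
            # never updated, and nothing flags the mistake.
            totl = total + item
        return total

    print(total_price([10, 20, 30]))  # prints 0, not 60

With required declarations, or any "use of an undeclared name is an error"
rule, the misspelling above would be caught before the program ever ran.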
TPFIC
--
Quinn Tyler Jackson