Subject: How should size of program grow with size of problem?
From: "Julian V. Noble" <email@example.com>
Organization: University of Virginia
Date: Thu, 28 Oct 1993 20:44:55 GMT
Is there any literature on how program size should vary with task "size"?
This question is prompted by my observations re: recent commercial apps
such as MathCad that seem to be growing exponentially with time. My
intuition says that MathCad (like lots of other programs I use) is at
least 10x longer than it needs to be.
Forgetting all the truly superfluous code found in apps like Windows (I
read that if one pushes a magic button out come little dancing men waving
banners; I sure did not pay good money to have my disk cluttered with
crap!), nevertheless these programs are longer than they should be and
demand too much swap space and RAM for what they actually do.
Having looked at the Windows API in the bookstore, I found lots of
near-duplicate function names.
I would bet that one reason for the vast panorama of Windows API functions
is that large numbers of them differ only slightly in what they do. That
is, the application is poorly factored. Without access to the C (C++ ?)
source for MathCad, I cannot guarantee this is also true of the latter;
but it is certainly true of Ventura Publisher (even the DOS version) and
other programs I have examined.
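To make the factoring complaint concrete, here is a hypothetical sketch
(the names are invented for illustration, not real Windows API calls):
a library that exports three near-duplicate drawing entry points, versus
one that exposes a single parameterized primitive.

```python
# Hypothetical sketch -- invented names, not any real API.
# A poorly factored library exports many near-duplicate entry points:

def draw_solid_line(x0, y0, x1, y1):
    return ("line", x0, y0, x1, y1, "solid", 1)

def draw_dashed_line(x0, y0, x1, y1):
    return ("line", x0, y0, x1, y1, "dashed", 1)

def draw_thick_line(x0, y0, x1, y1):
    return ("line", x0, y0, x1, y1, "solid", 3)

# A well-factored library exposes one primitive; the variations become
# parameters rather than separate functions:

def draw_line(x0, y0, x1, y1, style="solid", width=1):
    return ("line", x0, y0, x1, y1, style, width)

# The three specialized calls collapse into the single primitive, and
# new style/width combinations cost no new entry points at all:
assert draw_solid_line(0, 0, 10, 10) == draw_line(0, 0, 10, 10)
assert draw_thick_line(0, 0, 10, 10) == draw_line(0, 0, 10, 10, width=3)
```

The point of the sketch: every specialized function that could have been
a parameter setting is a symptom of the poor factoring described above.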
Factoring programs is a little like finding the basis of a vector space.
That is, one wants to find a minimal set of operations that will accomplish
the task at hand. In some sense the operations of this set will have
zero overlap (or at least, *** small *** overlap -- nobody's perfect!),
just like the vectors of an orthonormal basis. Of course, a spanning set
for an N-dimensional space can contain more than N vectors; it's just that
they will then overlap, and the more vectors there are, the more there
will be that are nearly identical (almost 100% overlap).
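The overlap in the analogy can be checked numerically. A small sketch
(Python here purely for the arithmetic): the cosine of the angle between
two vectors measures their overlap, 0 for orthogonal, 1 for identical,
and a third vector crammed into a 2-dimensional space must overlap
heavily with one of the basis vectors.

```python
import math

def cos_overlap(u, v):
    """Cosine of the angle between two vectors: 0 = orthogonal, 1 = identical."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(a * a for a in v))
    return dot / norm

# Two orthonormal basis vectors span the plane with zero overlap:
e1, e2 = (1.0, 0.0), (0.0, 1.0)
assert cos_overlap(e1, e2) == 0.0

# Add a third vector to the same 2-D space and some pair must overlap;
# here the third is nearly parallel to e1 (overlap close to 1):
v3 = (1.0, 0.1)
assert cos_overlap(e1, v3) > 0.99
```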
In other words, I think the above analogy explains to some extent the
exasperating and uncontrolled growth of applications programs. Object-
oriented and modular environments designed to coordinate the efforts of
large programming teams would, I should imagine, foster the creation of
over-complete sets of functions or modules in a given program, because the
modules never get to use each other's components.
The next question would be how to relate the complexity of an application
to the size N of the "vector space" alluded to above. My guess is that N
grows *** slower *** than the application per se (natural language is an
example), perhaps only logarithmically. In that case, it should be possible to
control program size, possibly by "orthogonalizing" its set of operations,
perhaps even by some automated process.
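A back-of-envelope version of that guess, under an assumption invented
here purely for illustration: if every feature of an application were a
composition of k primitives drawn from the basis, then N primitives
would cover N^k features, so log F = k log N and the basis need only
grow like the k-th root of the feature count.

```python
import math

# Back-of-envelope sketch.  The composition-depth assumption (every
# feature = a chain of `depth` primitives) is hypothetical, not a claim
# about any real program.

def basis_size(features, depth):
    """Smallest basis N with N**depth >= features."""
    return math.ceil(features ** (1.0 / depth))

# With depth-3 composition, growing the feature count from 100 to
# 10,000 only grows the required basis from 5 primitives to 22:
assert basis_size(100, 3) == 5
assert basis_size(10_000, 3) == 22
assert basis_size(10_000, 2) == 100
```

Under that reading, "orthogonalizing" an application means shrinking it
back toward the small basis, while an unfactored one carries something
closer to one entry point per feature.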
OOP lets you reuse your code. That's the good part. The bad part is, you
reuse your code. ;-)
Please reply by e-mail unless there is a vast literature you want to
call to everyone's attention.
Julian V. Noble