Re: On Legacy Applications and Previous Work

steve@cegelecproj.co.uk (Steve_Kilbane)
Tue, 22 Mar 1994 09:11:42 GMT

          From comp.compilers

Related articles
On Legacy Applications and Previous Work PAUL@TDR.COM (Paul Robinson) (1994-03-06)
Re: On Legacy Applications and Previous Work donawa@bnr.ca (chris (c.d.) donawa) (1994-03-21)
Re: On Legacy Applications and Previous Work bill@amber.csd.harris.com (1994-03-14)
Re: On Legacy Applications and Previous Work baxter@austin.sar.slb.com (1994-03-16)
Re: On Legacy Applications and Previous Work steve@cegelecproj.co.uk (1994-03-22)
Re: On Legacy Applications and Previous Work bart@cs.uoregon.edu (1994-03-23)
Re: On Legacy Applications and Previous Work pardo@cs.washington.edu (1994-03-24)
Re: On Legacy Applications and Previous Work bill@amber.csd.harris.com (1994-03-25)
Re: On Legacy Applications and Previous Work mboucher@silver.sdsmt.edu (1994-03-29)
Re: On Legacy Applications and Previous Work bill@amber.csd.harris.com (1994-04-04)
| List of all articles for this month |

Newsgroups: comp.compilers
From: steve@cegelecproj.co.uk (Steve_Kilbane)
Keywords: tools
Organization: Compilers Central
References: 94-03-058
Date: Tue, 22 Mar 1994 09:11:42 GMT

Bill Leonard <bill@ssd.csd.harris.com> writes:
> Reusing tiny pieces of code is usually not worth it. As Paul points out,
> you usually end up changing it anyway, or if you don't, you had to spend
> longer looking for it than it would take you to write it. The larger the
> database, the longer the lookup time.
>
> Reusing large pieces of code is better. But writing reusable software
> that does something complicated enough to justify reuse is hard, real hard
> -- and time-consuming. Suppose you go to your manager and say, "I can
> have it done in 3 months, or a year if you'll let me make it reusable."
> I'll give you one guess which way the manager is likely to vote.


Sadly, this is true. However, I'd like to recount something included in a
presentation by Les Hatton of Programming Research Ltd, at the '94 UKUUG
conference. It seems that PRL have been involved in projects which attempt
to measure improved quality in a concrete fashion, by incrementally
extending an enforced programming standard and recording the number of
bugs found in the software. Although not mentioned in the presentation, I
presume that the project involved the use of PRL's quality-checking
toolset, which detects static problems in your software -
it's like lint, with more checks.


Anyway, at the start of the project, a very loose standard was set, and
all existing software was measured against it. Each time a file was
checked out and edited, its conformance had to be better when the file was
checked back in. Once all the files conformed, the standard was made
stricter, and the cycle repeated. Entirely new software had to conform to
the current standard immediately.
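The essence of that check-in rule is a ratchet: a file's measured
non-conformance may never increase, and each improvement becomes the new
bar. A minimal sketch of that rule, in Python (entirely hypothetical - the
actual PRL toolset and its conformance measure are not described in the
presentation, so "warning count" here is just a stand-in metric):

```python
def ratchet_allows(baseline, path, current):
    """Decide whether a check-in is permitted under the ratchet rule.

    baseline -- dict mapping file path to its best-known warning count
    path     -- the file being checked in
    current  -- the warning count measured at check-in time

    Returns (allowed, updated_baseline). A file new to the baseline sets
    its own bar; an existing file may only stay level or improve, and any
    improvement becomes the new bar.
    """
    before = baseline.get(path, current)  # new files: current count is the bar
    if current > before:
        return False, baseline            # conformance got worse: reject
    updated = dict(baseline)
    updated[path] = current               # ratchet: lower count is the new bar
    return True, updated


# Example: a file checked in at 5 warnings may later drop to 3,
# but a check-in that raises the count is refused.
baseline = {}
ok, baseline = ratchet_allows(baseline, "parser.c", 5)   # first check-in
ok, baseline = ratchet_allows(baseline, "parser.c", 3)   # improvement: allowed
ok, baseline = ratchet_allows(baseline, "parser.c", 4)   # regression: refused
```

Tightening the standard then corresponds to the checker reporting more
warnings for the same code, which forces edited files to improve before
they can be checked back in.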


Now the point: one of the unexpected side-effects of the process was a
high level of software reuse. The reason turned out to be that it was
such an effort to get entirely new software through the now-strict
standard that the programmers preferred to reuse code that had already
passed.


There are problems with this approach, of course (the first that comes to
mind is that reuse should start at the design level, not at the coding
level), but it's still worth thinking about.


Steve
--
<Steve_Kilbane@cegelecproj.co.uk>, <Steve_Kilbane@gec-epl.co.uk>
