From: firstname.lastname@example.org (David Moore)
Date: Wed, 6 Jan 1993 19:24:21 GMT
email@example.com (Dale R. Worley) writes:
>How important is generated code quality these days? There are a lot of
>good optimization techniques that seem to be adequate for ordinary
>programming. But they still are at least 10% or 20% worse than the ideal.
>Is there much of a market for another 10% in speed of generated code?
It seems to me that compile time is roughly exponential in the deficiency
of the generated code: producing code that is 10% worse than optimal takes
twice as long as producing code that is 20% worse than optimal (if your
compiler is optimizer-bound). I suspect the programmer time required to
make the optimizer solid is also exponential.
So getting those last few percent requires a lot of resources.
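The doubling claim above can be sketched as a toy cost model. This is purely
illustrative and not from the post: the function `compile_time`, the baseline
`worst_pct`, and all constants are assumptions chosen so that removing each
10 percentage points of deficiency doubles the compile time.

```python
def compile_time(deficiency_pct, base_time=1.0, worst_pct=30.0):
    """Hypothetical compile time (in arbitrary units) for generated code
    that is `deficiency_pct` percent worse than optimal.

    Toy exponential model: time doubles for every 10 percentage points of
    deficiency eliminated, starting from unoptimized code at `worst_pct`.
    """
    return base_time * 2 ** ((worst_pct - deficiency_pct) / 10.0)

# Under this model, going from 20% worse to 10% worse doubles compile time:
t20 = compile_time(20.0)
t10 = compile_time(10.0)
ratio = t10 / t20  # 2.0
```

The same shape applies to the programmer-time conjecture: each constant-factor
improvement in code quality costs a constant multiple of effort, so total cost
grows exponentially as the deficiency approaches zero.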
Perhaps someone has collected some numbers on this? I am basing this on a
gut feeling from writing an optimizer.
[It varies all over the place. The Princeton/Bell Labs lcc compiler
is supposed to produce better code faster than GCC. Ken Thompson's Plan 9
compiler is supposed to be better still in both dimensions. -John]