From: eugene@eos.arc.nasa.gov (Eugene Miya)
Newsgroups: comp.compilers
Date: 9 Sep 89 00:23:05 GMT
Organization: NASA Ames Research Center, Calif.
I wish to ask the readership a question of intuition. Please
send your answers to me via mail. Do not post. I assume that you
either 1) have written or maintain a compiler, or 2) do a LOT of
compiles with different compilers, etc.
I am working part time on some tools for performance measurement
and compiler quality. Along the way, I have created some basic
testing tools. One such tool I've written is called CONTROL.
It's a program generator, part of a benchmark generator yet to be finished.
Now, compilers, as we all know, are reasonably complex programs, and
their parts take up different amounts of execution time. But the
question I want to ask is as follows.
Suppose I gave you a 10,000 line program and it compiled in time X.
Then I gave you a 20,000 line program of similar composition
(yes, I know I am being vague, but that's the point: I want to survey
your intuitive feel). Would you expect this 20,000 line program to take
twice as long to compile (2X), or >2X, or <2X? No fair running off and trying this.
Yes, I am over-simplifying analysis, parsing, optimization, and code
generation. Let's just assume the character of the 10K-line program is
the "same" as that of the 20K-line program.
Are your compilers "linear" in their behavior? Or are they optimal?
Or are they expensive to run?
2X? <2X? >2X?
Let me know. Intuition only please.
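(For after you've mailed your guess: a rough sketch, in Python, of the
sort of timing experiment a generator like CONTROL is meant to feed.
The shape of the generated functions and the "cc -O -c" invocation are
illustrative assumptions of this sketch, not the actual tool.)

import os
import subprocess
import tempfile
import time

def make_source(n_funcs):
    # Emit n_funcs tiny functions, 4 lines each, plus a main();
    # 2500 functions is roughly a 10,000 line program.
    lines = []
    for i in range(n_funcs):
        lines.append("int f%d(int x) {" % i)
        lines.append("    int y = x + %d;" % i)
        lines.append("    return y * 2;")
        lines.append("}")
    lines.append("int main(void) { return f0(1); }")
    return "\n".join(lines) + "\n"

def time_compile(src_text):
    # Write the generated source to a scratch directory and time
    # one wall-clock compile of it. "cc -O -c" is an assumed
    # invocation; substitute your own compiler and flags.
    with tempfile.TemporaryDirectory() as d:
        src = os.path.join(d, "gen.c")
        with open(src, "w") as f:
            f.write(src_text)
        t0 = time.perf_counter()
        subprocess.run(["cc", "-O", "-c", src, "-o",
                        os.path.join(d, "gen.o")], check=True)
        return time.perf_counter() - t0

t10k = time_compile(make_source(2500))   # ~10,000 lines
t20k = time_compile(make_source(5000))   # ~20,000 lines
print("10K: %.2fs  20K: %.2fs  ratio: %.2f" % (t10k, t20k, t20k / t10k))

A ratio near 2 would suggest linear behavior; well above 2 would point
at superlinear phases.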
Another gross generalization from
--eugene miya, NASA Ames Research Center, eugene@aurora.arc.nasa.gov
resident cynic at the Rock of Ages Home for Retired Hackers:
"You trust the `reply' command with all those different mailers out there?"
"If my mail does not reach you, please accept my apology."
{ncar,decwrl,hplabs,uunet}!ames!eugene
Live free or die.