From: email@example.com (Christopher Glaeser)
Date: Tue, 14 Nov 1995 03:48:56 GMT
> From: firstname.lastname@example.org (Thomas Womack)
> Are present-day compilers constrained in any way by the amount
> of time that people are prepared to wait for a compile normally?
Absolutely. Many optimization algorithms are quite simple if compile
time is not an issue, and are considerably more complex when
heuristics, additional data structures, and other techniques are used
to reduce the compile time. These heuristics often discover a local
minimum, which means that the ideal solution remains undiscovered due
to limited search time.
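To make the local-minimum point concrete, here is a toy sketch (not taken from any real compiler; the cost table and function names are invented for illustration). A fast greedy search that only ever moves to a cheaper neighbor can stop at a local minimum, while an exhaustive search, affordable only when time is no issue, finds the true optimum:

```python
# Toy cost function with a local minimum at x=2 and the
# global minimum at x=8 (values chosen purely for illustration).
COST = {0: 9, 1: 5, 2: 3, 3: 6, 4: 7, 5: 6, 6: 4, 7: 2, 8: 1, 9: 8}

def cost(x):
    return COST[x]

def greedy_descent(x):
    """Fast heuristic: move to a cheaper neighbor until none exists."""
    while True:
        better = [n for n in (x - 1, x + 1)
                  if n in COST and cost(n) < cost(x)]
        if not better:
            return x          # stuck -- possibly only a local minimum
        x = min(better, key=cost)

def exhaustive():
    """Slow but exact: examine every candidate."""
    return min(COST, key=cost)

print(greedy_descent(0))  # 2 -- the local minimum nearest the start
print(exhaustive())       # 8 -- the global minimum
```

Starting from x=0, the greedy search halts at 2 because both neighbors cost more, never discovering that 8 is cheaper still; that is exactly the trade a time-bounded heuristic makes.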
During the performance analysis phase, NULLSTONE collects both
compile-time and run-time statistics for each of the performance
tests, and is effective at isolating some types of compile-time
performance problems. This feature matters, because marketing
surveys show that compilation speed is a significant factor when
programmers choose a compiler.
Faster machines, larger caches, and larger memories all help with
compilation speed issues, but exponential algorithms are often
avoided, at least by design, unless N is known to be small.
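A common pattern for "exponential only when N is known to be small" can be sketched as follows (a hypothetical example; `pair_cost` and the cutoff of 8 are invented stand-ins, not any real compiler's heuristic). An O(N!) exhaustive search over orderings is attempted only below a size limit, with a cheap fallback above it:

```python
import itertools

def pair_cost(a, b):
    # Hypothetical cost of placing item b immediately after item a.
    return (a * b) % 7

def schedule_cost(order):
    # Total cost of an ordering is the sum over adjacent pairs.
    return sum(pair_cost(a, b) for a, b in zip(order, order[1:]))

def best_schedule(items, limit=8):
    """Exhaustive O(N!) search, attempted only when N is small."""
    if len(items) > limit:
        # Fall back to the input order (stand-in for a cheap heuristic).
        return list(items)
    return min((list(p) for p in itertools.permutations(items)),
               key=schedule_cost)
```

With four items the factorial search is trivial; with forty it would never finish, so the guard ensures the exponential path is taken only by design.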