Newsgroups: comp.compilers
From: cdg@nullstone.com (Christopher Glaeser)
Keywords: optimize
Organization: Compilers Central
References: <47b2fl$d4l@news.ox.ac.uk>
Date: Tue, 14 Nov 1995 03:48:56 GMT
> From: mert0236@sable.ox.ac.uk (Thomas Womack)
> Are present-day compilers constrained in any way by the amount
> of time that people are prepared to wait for a compile normally?
Absolutely. Many optimization algorithms are quite simple if compile
time is not an issue, and are considerably more complex when
heuristics, additional data structures, and other techniques are used
to reduce the compile time. These heuristics often discover a local
minimum, which means that the ideal solution remains undiscovered due
to limited search time.
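To make the local-minimum point concrete, here is a minimal sketch (my
illustration, not anything from a real compiler) contrasting an exhaustive
search with a greedy hill-climbing heuristic on a toy cost landscape.  The
"states" stand in for alternative code sequences and cost[] for their
execution cost; all names are invented for the example.

#include <stdio.h>

#define NSTATES 8

/* Toy costs of eight alternative code sequences reachable by
 * successive rewrites; state 0 is the unoptimized code. */
static const int cost[NSTATES] = { 9, 7, 6, 8, 4, 5, 3, 10 };

/* Greedy heuristic: keep rewriting while the next state is cheaper;
 * stops at the first local minimum it reaches (state 2, cost 6). */
static int greedy(int start)
{
    int s = start;
    while (s + 1 < NSTATES && cost[s + 1] < cost[s])
        s++;
    return s;
}

/* Exhaustive search: examine every state and return the global
 * minimum (state 6, cost 3) -- trivial here, but prohibitively
 * expensive for real rewrite spaces. */
static int exhaustive(void)
{
    int best = 0;
    for (int s = 1; s < NSTATES; s++)
        if (cost[s] < cost[best])
            best = s;
    return best;
}

int main(void)
{
    int g = greedy(0), e = exhaustive();
    printf("greedy stops at state %d (cost %d)\n", g, cost[g]);
    printf("exhaustive finds state %d (cost %d)\n", e, cost[e]);
    return 0;
}

The greedy pass gives up as soon as no single rewrite improves the cost,
which is exactly the "local minimum" behavior described above.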
During the performance analysis phase, NULLSTONE collects both
compile-time and run-time statistics for each of the performance
tests, and is effective at isolating some types of compile-time
performance problems. This is an important feature, because
marketing surveys show that compilation speed weighs heavily in
programmers' decisions when they purchase a compiler.
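For readers unfamiliar with this kind of measurement, the sketch below is a
hypothetical harness (not NULLSTONE's actual implementation) showing one
way to collect both figures for a single test: time the compiler
invocation, then time the resulting executable.  The command lines and
file names are placeholders.

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* Run a shell command and return its wall-clock time in seconds. */
static double timed_system(const char *cmd)
{
    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    if (system(cmd) != 0)
        fprintf(stderr, "command failed: %s\n", cmd);
    clock_gettime(CLOCK_MONOTONIC, &t1);
    return (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
}

int main(void)
{
    double compile_s = timed_system("cc -O2 -o test001 test001.c");
    double run_s     = timed_system("./test001");
    printf("compile: %.3f s   run: %.3f s\n", compile_s, run_s);
    return 0;
}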
Faster machines, larger caches, and larger memories all help with
compilation speed issues, but exponential algorithms are often
avoided, at least by design, unless N is known to be small.
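One common way to honor that "unless N is small" caveat is to gate the
expensive pass on problem size.  The following sketch (again my own
illustration, with hypothetical names and an arbitrary threshold) only
attempts an exhaustive walk over all 2^n candidates when n is small, and
falls back to a cheap heuristic otherwise.

#include <stdio.h>

#define SMALL_N 20   /* 2^20 candidates is still tolerable at compile time */

/* Exhaustive strategy: visit every one of the 2^n candidates. */
static long exhaustive_subsets(int n)
{
    long visited = 0;
    for (long mask = 0; mask < (1L << n); mask++)
        visited++;
    return visited;
}

/* Heuristic strategy: a stand-in with quadratic cost. */
static long heuristic(int n)
{
    return (long)n * n;
}

int main(void)
{
    for (int n = 10; n <= 40; n += 10) {
        long work = (n <= SMALL_N) ? exhaustive_subsets(n) : heuristic(n);
        printf("n = %2d: %s, ~%ld steps\n", n,
               n <= SMALL_N ? "exhaustive" : "heuristic", work);
    }
    return 0;
}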
Regards,
Christopher Glaeser
Nullstone Corporation