Newsgroups: comp.compilers
From: chase@centerline.com (David Chase)
Keywords: C++, optimize
Organization: CenterLine Software
References: 95-08-067 95-08-075
Date: Fri, 11 Aug 1995 14:35:56 GMT
David Toland <det@sw.stratus.com> writes:
> Having to assume a strict left to right sequencing
> on operations involving side effects can greatly affect the degree of
> optimization possible, particularly on compilers for RISC
> architectures which place a high premium on maintaining values in
> registers as much as possible.
You state this as fact. Where has this fact been measured? In the
practice that I am aware of, RISC optimizers DO NOT reorder the
evaluation of side-effecting expressions -- the order is chosen (perhaps
at random) by the front-end, and that is the end of the story. This is
the third time I have asked for references where this has been measured,
and so far the only one I've seen came from Will Clinger, describing
Scheme/Lisp optimization (a 7.5% code-size reduction).
Yet, your claim has been dogma since before Scheme even existed. Surely,
in all that time, someone MUST have measured the benefit. Are we
engineers and scientists, or just a bunch of guys playing software air
guitar? This unspecified order has a cost -- software may change its
behavior from platform to platform, from compiler to compiler, even from
one set of flags to another. I thought that we were supposed to be
worrying about reliability, quality, portability, maintainability, and
productivity, not the last scrap of performance. (Where's Ralph Nader
when you need him?)
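
To make the cost concrete, here is a minimal sketch of my own (the
names are mine, not anything from the thread); whether it prints
"1 2" or "2 1" depends on which order the compiler happens to pick:

    #include <stdio.h>

    static int counter = 0;
    static int next_val(void) { return ++counter; }  /* side effect */

    int main(void) {
        /* The order in which the two calls are evaluated is
           unspecified: one compiler may print "1 2", another "2 1". */
        printf("%d %d\n", next_val(), next_val());
        return 0;
    }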
And it is entirely true that code should not be written to depend on
such tricky things, because such code is, in fact, difficult to maintain.
However, it has always been difficult (or at least obnoxious) to
manually ensure that code is free of such order-dependence, and this has
only been made worse by many of the features in C++ (and other languages
as well). Templates/generics, virtual function dispatch, implicit
conversions, and operator overloading all make it more difficult to
ensure that code is free of any dependence on order of evaluation.
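
For instance, here is a sketch of my own showing how a converting
constructor and an overloaded operator can hide the dependence (the
Logger class and its names are hypothetical):

    #include <stdio.h>

    static int log_pos = 0;

    struct Logger {
        int seq;
        /* Converting constructor with a side effect: it prints. */
        Logger(const char *msg) : seq(++log_pos) { printf("%s\n", msg); }
    };

    /* Overloading + turns the operands into ordinary function
       arguments, whose order of evaluation is unspecified. */
    static int operator+(const Logger &a, const Logger &b) {
        return a.seq + b.seq;
    }

    int main() {
        /* The right operand is implicitly converted to Logger; the
           two constructors (and their printfs) may run in either
           order, so this may print "left right" or "right left". */
        int r = Logger("left") + "right";
        return r == 3 ? 0 : 1;  /* r is 3 either way */
    }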
> Keep in mind that to maintain strict left to right (or any other
> canonical) semantics on side effects means that any operation which
> MIGHT result in a side effect must then produce a sequence point,
> resulting in smaller optimizable chunks and fewer opportunities for
> instruction reordering.
And, in fact, this is already the case in the optimizers that I have
observed/read about.
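
To spell that out with an illustration of my own (not drawn from any
particular optimizer): fixing the order of side effects does not pin
down the side-effect-free work in between, which can still be
scheduled freely under the usual as-if rule:

    extern int a[100], b[100];

    int f(int i, int j) {
        /* Neither load nor multiply has a side effect, so an
           optimizer may schedule these in any order, or interleave
           them with surrounding code, without changing observable
           behavior -- no matter what order the language fixes for
           side effects. */
        int x = a[i] * 2;
        int y = b[j] * 3;
        return x + y;
    }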
> In the case of a seemingly esoteric change like specifying the order
> of evaluation of operands, the cost to users should be 0 (any code
> that would be affected adversely is already incorrect and subject to
> failure), but the cost to existing compiler implementations is
> potentially staggering.
I disagree. At least, the way I learned to write a compiler, and the way
that the compilers that I know of (Unix workstation compilers, mostly)
are written, this would be a very minor change. The order chosen by the
front-end would change, and the optimizers would continue to respect that
order, as they do now. I'm also certain that somewhere out there is a
poorly written compiler, for which this would be a terrible burden. So
what? All users should suffer, because of a few lazy implementors? This
makes no sense to me at all.
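
As a sketch of what I mean (my own illustration of typical front-end
lowering, not any particular compiler's output): the front-end already
pins an order by introducing temporaries, and the optimizer works
within it:

    /* Source expression:  x = f() + g();
       A front-end that fixes left-to-right evaluation emits,
       in effect: */
    int lower_example(int (*f)(void), int (*g)(void)) {
        int t1 = f();    /* left operand, evaluated first   */
        int t2 = g();    /* right operand, evaluated second */
        return t1 + t2;  /* the optimizer respects this order,
                            since f and g may have side effects */
    }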
Everyone has to rip their compilers apart every time the committee meets
anyhow, and the vast majority of them (all but ONE that I've tried)
exhibit entertaining or embarrassing bugs. They've got to be fixed
anyway; why not fix them right?
speaking for myself,
David Chase