Re: Debugging of optimized code


Newsgroups: comp.compilers
From: bill@amber.ssd.csd.harris.com (Bill Leonard)
Keywords: optimize, debug
Organization: Harris Computer Systems, Ft. Lauderdale FL
References: 95-01-036 95-01-057
Date: Wed, 1 Feb 1995 21:40:17 GMT

Milton Barber <milt@Eng.Sun.COM> writes:
> I would like to address one tiny little part of this big area, namely
> the issue of "Where am I, in the program?".
>
> Assume we have a very thoroughgoing optimizer, and let's watch some
> one particular statement as it passes through the compiler.
>
> First, the routine containing the statement is inlined in several
> places (and, in some of these places, but not all, some higher-
> order inlining takes place, i.e. the inlined code is again inlined),
> leaving our statement now duplicated in a number of places throughout
> the program, in a number of different program contexts. Furthermore,
> let's assume that our statement contains some references to the
> formal parameters of its containing routine, so, in each of these
> places where it has been inlined, the actual parameters of the call
> have been substituted, thus our statement is now not only duplicated
> in a number of different contexts, it is actually a different
> statement in each of those contexts.
>
> Second, we apply loop-level transformations, such as loop inversion
> or cache-blocking, drastically changing the looping control flow
> in which some of the statement instances are involved.
>
> Third, we apply normal global optimization, which will proceed to
> "spread out" the effects of each instance of our statement via such
> things as cse, forward and backward code motion, etc. Of course,
> since each of the instances of our statement is different and in a
> different context, different effects will apply at each place.
>
> Fourth, we apply loop unrolling and/or software pipelining, (which
> will apply to some of the parts of some of the instances of our
> statement, but probably not all parts, or all instances, because of
> the differing environments).
>
> Fifth, we apply various control-flow rearrangement schemes, say
> for example, tail splitting, so there is duplication of some parts
> of some instances of code derived from the statement.
>
> Sixth, we apply global scheduling, which will take the instructions
> of the program and move them around among the basic blocks of the
> program, possibly further changing the control-flow and possibly
> introducing compensation code.
>
> Seventh, we apply code order selection, which might result in
> "later" parts of code occurring earlier in the program than "earlier"
> parts of code.


This is a very nicely-written synopsis of the problem.


One simplification, which we make in our compilers, is to sever the
connection between an expression and the statement it originated in for
optimizations such as code motion. In other words, if a statement such
as


        a = (b + c)/d


occurs in a loop, and the expression (b + c) is moved out of the loop, the
code evaluating (b + c) is no longer considered part of the assignment
statement. (In our compilers, it would be considered part of the loop
prologue and thus probably be associated with the loop header statement.)
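
To make that concrete, here is a minimal sketch; the functions and the line
numbers in the comments are hypothetical, and the comments only illustrate
how the convention charges the hoisted code to the loop header rather than
to the assignment:

    /* Source as written (line numbers in the comments are hypothetical). */
    void before(double *a, double b, double c, double *d, int n)
    {
        int i;
        for (i = 0; i < n; i++)       /* "line 10": the loop header        */
            a[i] = (b + c) / d[i];    /* "line 11": our statement          */
    }

    /* What the optimizer effectively produces after hoisting (b + c). */
    void after(double *a, double b, double c, double *d, int n)
    {
        int i;
        double t = b + c;             /* loop prologue: charged to line 10,
                                         no longer part of line 11         */
        for (i = 0; i < n; i++)
            a[i] = t / d[i];          /* line 11 "begins" here, even though
                                         part of it has already executed   */
    }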


We adopted this convention because of the problems mentioned by Mr. Barber.
The rationale is that once an expression has been "commonized", the code
can no longer be said to belong to just one statement, but possibly many.
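
A small sketch of that rationale (hypothetical code, not output from our
compiler): after common-subexpression elimination the shared computation is
owned by both statements, or by neither, depending on how you choose to
look at it.

    /* As written: two statements share the subexpression (b + c). */
    void source_form(double b, double c, double d, double e,
                     double *x, double *y)
    {
        *x = (b + c) / d;    /* statement S1 */
        *y = (b + c) * e;    /* statement S2 */
    }

    /* After CSE: to which statement does the add belong? */
    void after_cse(double b, double c, double d, double e,
                   double *x, double *y)
    {
        double t = b + c;    /* "commonized": arguably part of S1 and S2 both */
        *x = t / d;          /* S1 */
        *y = t * e;          /* S2 */
    }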


Of course, this means that when execution actually reaches what the
debugger considers the beginning of a statement, part of that statement may
already have been evaluated. We consider that this is less of a problem
than having execution "bounce around" in a surprising way.


As for the issues of inlining, loop unrolling, and other transformations
that make complete copies of a statement, the best solution (I think) is to
record the existence of multiple instances of that statement. If you ask
the debugger to set a breakpoint on that statement, it should really set a
breakpoint on every separate instance (our debugger does this), but to the
user the debugger should make it appear that there is only a single
breakpoint.
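
A rough sketch of the idea (the data structures and names below are
hypothetical, not our debugger's internals): one user-visible breakpoint
fans out to a physical breakpoint at every recorded instance.

    #include <stddef.h>
    #include <stdio.h>

    typedef unsigned long addr_t;

    struct instance {             /* one copy of the statement in the object code */
        addr_t start;             /* address of its earliest instruction          */
    };

    struct logical_bp {           /* what the user sees: a single breakpoint      */
        int line;
        size_t ninst;
        const struct instance *inst;
    };

    /* Plant a physical breakpoint at every instance, but report one breakpoint. */
    void set_breakpoint(const struct logical_bp *bp, void (*plant)(addr_t))
    {
        size_t i;
        for (i = 0; i < bp->ninst; i++)
            plant(bp->inst[i].start);
        printf("Breakpoint set at line %d (%lu instances)\n",
               bp->line, (unsigned long)bp->ninst);
    }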


Once you do this, any problems associated with applying different
optimizations to different instances of the statement go away. If
you can handle the application of those optimizations to a single instance,
then you can handle multiple instances.


Also, the notion of having multiple instances of a statement helps in
dealing with code scheduling, if you are careful to annotate which code
comes from which instance. In most cases, simply marking the "earliest"
instruction of each instance as the start of the statement will suffice.
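
For example (a sketch with hypothetical names), if each generated
instruction carries a tag saying which statement instance it was derived
from, the line-table entry for an instance is just the lowest tagged
address:

    #include <stddef.h>

    typedef unsigned long addr_t;

    struct insn_tag {
        addr_t addr;       /* address of a generated instruction              */
        int    instance;   /* which copy of the statement it was derived from */
    };

    /* The "start" of a statement instance: the earliest instruction derived
     * from it, even if later pieces have been scheduled far away. */
    addr_t instance_start(const struct insn_tag *tags, size_t n, int instance)
    {
        addr_t best = (addr_t)-1;   /* sentinel: instance emitted no code */
        size_t i;
        for (i = 0; i < n; i++)
            if (tags[i].instance == instance && tags[i].addr < best)
                best = tags[i].addr;
        return best;
    }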


Loop transformations such as loop inversion are not so much a problem for
"where am I" questions, but they do create problems with determining the
values of variables. At the very least, the compiler and debugger should
cooperate to tell the user that the variable in question cannot be
examined. Another poster indicated that the Convex debugger can synthesize
the values of some variables. That is quite nice, but it raises the question
of what to do if the user directs the debugger to change the variable's
value!
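
One way to convey this (a sketch only, not a claim about our compiler's
actual debug records) is for the compiler to emit, per variable, a list of
PC ranges saying where the value lives and where it is simply not
available; the debugger then refuses both reads and writes in the
unavailable ranges rather than guessing.

    #include <stddef.h>

    typedef unsigned long addr_t;

    enum loc_kind { LOC_NONE, LOC_REG, LOC_MEM };   /* LOC_NONE: not examinable */

    struct loc_range {
        addr_t lo, hi;         /* half-open PC range [lo, hi)             */
        enum loc_kind kind;
        long where;            /* register number or frame offset         */
    };

    /* Where (if anywhere) can this variable be examined at the given PC?
     * A LOC_NONE result should be reported to the user as "not available",
     * and an attempted modification should be refused outright. */
    const struct loc_range *find_location(const struct loc_range *r, size_t n,
                                          addr_t pc)
    {
        size_t i;
        for (i = 0; i < n; i++)
            if (pc >= r[i].lo && pc < r[i].hi)
                return &r[i];
        return NULL;
    }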


> If a loop containing some instance of the statement has been pipelined,
> parts of the statement instance from different iterations of the loop
> might be active simultaneously


True, but this mostly affects what one should report as the value of the
loop control variable. In such a case, the best answer is "I don't know".
I don't see that it affects the notion of where one is in the program.


> Furthermore, you can't even
> think of a statement instance as occupying some region of the code,
> from some start address to some ending address, because of the code
> order selection.


True, but any given program address can be associated with a statement
instance (or instances, if one wishes to be fanatic with common
sub-expression elimination and code motion).
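
In other words, the map runs from addresses back to instances, not from a
statement to one contiguous region. A sketch (hypothetical structures) of
that reverse lookup:

    #include <stddef.h>

    typedef unsigned long addr_t;

    struct range_entry {
        addr_t lo, hi;     /* half-open range of generated code          */
        int    line;       /* source line                                */
        int    instance;   /* which copy of that line produced this code */
    };

    /* Collect every (line, instance) whose code covers `pc`. With aggressive
     * CSE and code motion, more than one may legitimately match. */
    size_t instances_at(const struct range_entry *map, size_t n, addr_t pc,
                        const struct range_entry **out, size_t max_out)
    {
        size_t i, found = 0;
        for (i = 0; i < n && found < max_out; i++)
            if (pc >= map[i].lo && pc < map[i].hi)
                out[found++] = &map[i];
        return found;
    }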


> All of the optimizations discussed here occur in commercial compilers
> and have sufficient payoff that they are worth doing in any compiler
> aimed at a high-performance market. You cannot solve the problem of
> debugging a program passed through a highly optimizing compiler
> by telling the compiler writer he was wrong to put in all of these
> optimizations. The compiler writer just won't listen and will
> respond to the commercial pressures and put them in. Instead, you
> have to try to do SOMETHING to help the poor user faced with the need
> to debug a highly optimized program.


Nor can the compiler writer tell the user that he should just turn off the
optimizations to debug the program. This is often not feasible (the
program does not run in a reasonable amount of time, or doesn't fit in
memory, or ...), not realistic (the bug doesn't occur in unoptimized code),
or not practical (recompilation takes far too long).


> I think a key part of a solution to this problem is recognition of
> the fact that, above a certain level of optimization, the very idea
> of a positional correspondence between the original source program
> and the object program is broken; so many transformations have
> occurred to the program that any attempt to utilize this
> correspondence will mislead more than it will help.


I'm not sure I would agree that a positional correspondence is broken, but
the notion that the program is evaluated in a way that directly corresponds
to the way it is written is broken. That is, just because one writes "a =
(b+c)/d" doesn't mean that the program will consist of an addition, a
division, and a data move, in roughly that order.
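
As a purely illustrative sketch (a hypothetical schedule, not any
particular compiler's output), the pieces of that one statement may end up
interleaved with code for its neighbors:

    /* Source order:
     *     a = (b + c) / d;     <- our statement
     *     *p = q * r;          <- the next statement
     * One plausible scheduled order: */
    void scheduled(double *a, double b, double c, double d,
                   double *p, double q, double r)
    {
        double t1 = b + c;    /* the add, issued early to hide the divide's latency */
        double t2 = q * r;    /* code for the *next* source statement               */
        double t3 = t1 / d;   /* the divide, overlapped with the multiply           */
        *p = t2;              /* the next statement completes first                 */
        *a = t3;              /* the data move for our statement comes last         */
    }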


Hence, some higher-level concepts of debugging are needed in order
to talk about such highly-optimized programs in a way that is less coupled
to the original source form.


--
Bill Leonard
Harris Computer Systems Corporation
2101 W. Cypress Creek Road
Fort Lauderdale, FL 33309
Bill.Leonard@mail.hcsc.com
--

