Definition of Data Dependence Distance


Newsgroups: comp.compilers
From: pugh@cs.umd.edu (Bill Pugh)
Followup-To: comp.parallel
Keywords: optimize, parallel
Organization: U of Maryland, Dept. of Computer Science, Coll. Pk., MD 20742
Date: 6 Dec 91 18:12:24 GMT

[Copied from comp.parallel -John]


I've been wrestling with the appropriate exact definition of data
dependence distance (in loops), and haven't come to any definite
conclusions. I've talked about this with a few other researchers, and
nobody seems to have an answer (although most of them have opinions).
I figured I would put the question to the net:


    Does dependence distance represent the difference in the values of
    the loop variables, or the difference in the loop trip counts?


Consider the loop:


    for i := 0 to n by 2
        a[i] := a[i-4] + b[i]


Using the first definition, there would be a flow dependence with
distance 4, while using the second definition there would be a flow
dependence with distance 2.
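

To make the two readings concrete, the following small Python sketch
(purely illustrative; the bound n = 10 is made up) enumerates the
iterations of the loop above and prints, for each flow dependence, the
difference in loop-variable values and the difference in trip counts:


    # Purely illustrative sketch: enumerate the iterations of
    #     for i := 0 to n by 2: a[i] := a[i-4] + b[i]
    # and report the flow-dependence distance under both candidate definitions.
    n = 10                                   # made-up bound, for illustration only
    writes = {}                              # array index -> (i value, trip count) of the write
    for trip, i in enumerate(range(0, n + 1, 2)):
        if i - 4 in writes:                  # the read of a[i-4] hits an earlier write
            src_i, src_trip = writes[i - 4]
            print("value difference:", i - src_i,        # 4  (first definition)
                  "trip difference:", trip - src_trip)   # 2  (second definition)
        writes[i] = (i, trip)                # record that a[i] is written in this iteration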


A more problematical case is:


    for i := 0 to n do
        for j := i to n do
            a[i, j] := a[i-1, j-1] + b[i, j]


Using the first definition, the dependence distance is clearly (1,1).
Using the second definition, it isn't clear: should it be (1,1) or
(1,0)?
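

The same kind of enumeration, applied to this nest under one particular
reading of the second definition (each inner loop's trips counted from
its own lower bound j := i), comes out (1,0). The sketch below only
illustrates that reading; it does not settle whether it is the right one:


    # Purely illustrative sketch of the triangular nest, counting each inner
    # loop's trips from its own lower bound j := i.
    n = 6                                    # made-up bound
    writes = {}                              # (i, j) -> (i, j, outer trip, inner trip) of the write
    for ti, i in enumerate(range(0, n + 1)):
        for tj, j in enumerate(range(i, n + 1)):
            if (i - 1, j - 1) in writes:     # read of a[i-1, j-1]
                si, sj, sti, stj = writes[(i - 1, j - 1)]
                print("value difference:", (i - si, j - sj),      # (1, 1)
                      "trip difference:", (ti - sti, tj - stj))   # (1, 0) under this reading
            writes[(i, j)] = (i, j, ti, tj)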


Another problematical example:


    for i := n to 0 by -1 do
        a[i] := a[i+3] + b[i]


Using the first definition, there is a flow dependence with distance
-3, while using the second definition the distance is 3.
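

Adapting the same illustrative sketch to the downward-counting loop
shows the sign flip directly (again with a made-up bound):


    # Purely illustrative sketch of the downward-counting loop: trips and
    # i values now run in opposite directions.
    n = 10                                   # made-up bound
    writes = {}
    for trip, i in enumerate(range(n, -1, -1)):
        if i + 3 in writes:                  # read of a[i+3], written in an earlier iteration
            src_i, src_trip = writes[i + 3]
            print("value difference:", i - src_i,        # -3 (first definition)
                  "trip difference:", trip - src_trip)   # +3 (second definition)
        writes[i] = (i, trip)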


This would seem to suggest a preference for the second definition,
since it maintains the invariant that all dependence distances are
lexicographically positive. Allowing lexicographically negative
dependence distances would, among other things, force changes in the
definitions of when loop interchange and loop circulation are legal.
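

To spell out the invariant: the usual test is that the first nonzero
component of each distance vector be positive, and interchanging two
loops is taken to be legal when every distance vector, with those two
components swapped, still passes the test (or is all zero). A minimal
sketch, assuming plain distance vectors rather than direction vectors:


    def lex_nonneg(d):
        """True if d is all zeros or its first nonzero component is positive."""
        for x in d:
            if x != 0:
                return x > 0
        return True                          # all zeros: loop-independent dependence

    def interchange_legal(distances, p, q):
        """Interchanging loops p and q is legal if every dependence distance vector,
        with components p and q swapped, is still lexicographically non-negative."""
        def swapped(d):
            d = list(d)
            d[p], d[q] = d[q], d[p]
            return d
        return all(lex_nonneg(swapped(d)) for d in distances)

    print(interchange_legal([(1, 1)], 0, 1))     # True:  (1, 1) stays positive after the swap
    print(interchange_legal([(1, -2)], 0, 1))    # False: (-2, 1) is lexicographically negative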


HOWEVER, coming to the defense of the first definition is the
following example:


    for i := 0 to 10
        for j := i to 10 by 3
            a[i,j] := a[i-1,j+2] + b[i,j]


Using the first definition gives a dependence distance of (1,-2).
However, it isn't clear whether the dependence distance is even defined
if we use the second definition.
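

Running the same illustrative enumeration on this nest, with the
"count trips from the loop's own lower bound" reading used earlier,
prints (1,-2) for the value difference and (1,-1) for the trip
difference; whether that reading is what the second definition should
even mean here is exactly the open question:


    # Purely illustrative sketch of the last example, with bounds 0..10 as above.
    writes = {}                              # (i, j) -> (i, j, outer trip, inner trip) of the write
    for ti, i in enumerate(range(0, 11)):
        for tj, j in enumerate(range(i, 11, 3)):
            if (i - 1, j + 2) in writes:     # read of a[i-1, j+2]
                si, sj, sti, stj = writes[(i - 1, j + 2)]
                print("value difference:", (i - si, j - sj),      # (1, -2)
                      "trip difference:", (ti - sti, tj - stj))   # (1, -1) under this reading
            writes[(i, j)] = (i, j, ti, tj)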


----


Of course, normalizing all loops avoids the issue. While this may be
the solution of choice for compilers, interactive programming
environments probably don't wish to force all loops to be normalized.
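

For the first example, normalization would look roughly like the
sketch below: the new index simply counts trips, the original index is
recomputed in the body, and the two definitions then agree (both give
distance 2).


    # Purely illustrative sketch of normalizing 'for i := 0 to n by 2' to a unit-stride loop.
    n = 10                                   # made-up bound
    for t in range(n // 2 + 1):              # normalized index t counts trips 0, 1, 2, ...
        i = 2 * t                            # original index recovered from the trip count
        print(t, i)                          # body would be a[i] := a[i-4] + b[i]; in terms of t
                                             # the write is a[2t] and the read a[2(t-2)], so the
                                             # distance is 2 under either definition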




----
Comments?


Bill Pugh
Univ. of Maryland, College Park
pugh@cs.umd.edu