nuances of Copy Propagation? (& Lambda Calculus)

mark@omnifest.uwm.edu (Mark Hopkins)
Mon, 7 Nov 1994 08:32:14 GMT


Related articles
nuances of Copy Propagation? johnmce@world.std.com (1994-11-03)
nuances of Copy Propagation? (& Lambda Calculus) mark@omnifest.uwm.edu (1994-11-07)
Re: nuances of Copy Propagation? (& Lambda Calculus) steck@dcs.ed.ac.uk (Paul Steckler) (1994-11-11)

Newsgroups: comp.compilers
From: mark@omnifest.uwm.edu (Mark Hopkins)
Keywords: optimize
Organization: Omnifest
References: 94-11-028
Date: Mon, 7 Nov 1994 08:32:14 GMT

From johnmce@world.std.com (John McEnerney):
> I'm implementing the Copy Propagation optimization as described in the
> new Dragon Book, pp. 636-638, and I've run into an interesting problem
> motivated by the following case:
>
> y=x; z=y; x=3; w=z+100;


The best way to answer questions involving this is to realise that
transformation rules in imperative languages are really consequences of
the one rule of the Lambda Calculus: beta reduction. For the following
discussion, think of this sequence as being followed by some context
involving the variables, to be denoted as follows:


                  y = x; z = y; x = 3; w = z + 100; (...w, x, y, z...)


      Pretend the whole sequence is a series of functional programming let
statements, all followed by the tuple (w, x, y, z):


                let y = x in
                let z = y in
                let x = 3 in
                let w = z + 100 in (w, x, y, z)


This is none other than the Lambda expression (where I'm using backslashes
for lambdas):


                (\y. (\z. (\x. (\w.( w, x, y, z)) (z + 100) ) 3 ) y) x


and Beta reduces as follows (retaining the let syntax):


        --> let y = x in
                let z = y in
                let x = 3 in (z + 100, x, y, z)


        --> let y = x in
                let z = y in (z + 100, 3, y, z)


        --> let y = x in (y + 100, 3, y, y)


        --> (x + 100, 3, x, x)


which is, in turn, beta equivalent to the sequence:


                let y = x in let z = x in
                let w = x + 100 in
                let x = 3 in (w, x, y, z)


which can be generated by the assignment sequence:


                y = z = x; w = x + 100; x = 3; (...w, x, y, z...)


which, in turn, is equivalent to the original sequence.
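
      If you want to see this concretely, the whole thing can be typed almost
verbatim into Haskell; the nested lets shadow x exactly the way the
reassignment does, and the tuple stands in for the trailing context.  This is
just a sketch (the names "original" and "reduced" are mine; GHC will warn
about the shadowed x under -Wall, but accepts it):

        original, reduced :: Int -> (Int, Int, Int, Int)
        original x =
          let y = x in
          let z = y in
          let x = 3 in
          let w = z + 100 in (w, x, y, z)
        reduced x = (x + 100, 3, x, x)

        -- original 5 == reduced 5 == (105, 3, 5, 5)
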


> The dataflow computation tells me that I can propagate 'x' to 'z=y', and
> that I can propagate 'y' to 'w=z+100', but obviously I can't eliminate
> both 'y=x' and 'z=y' because of the assignment to x. So I can choose one
> or the other:
>
> z=x; x=3; w=z+100; -OR- y=x; x=3; w=y+100;


In terms of the Lambda expression:


                let y = x in
                let z = y in
                let x = 3 in
                let w = z + 100 in (w, x, y, z)


"x can be propagated to z = y" means simply that the Beta reduction


                (\y. let z = y in let x = 3 in...) x


        -> (let z = x in let x = 3 in...)


substitutes x for all *free* occurrences of y below. In dataflow terms,
a free occurrence is any occurrence of a variable preceding its redefinition.
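
      Each of the two alternatives quoted from the question is just one of
these Beta reductions carried out, with the dead copy dropped from the
context.  A quick Haskell sketch (the names alt1 and alt2 are mine, and it
assumes y and z are not used after this fragment, which is why they can be
dropped at all):

        alt1, alt2 :: Int -> (Int, Int)   -- only (w, x) survive, by assumption
        alt1 x = let z = x in let x = 3 in let w = z + 100 in (w, x)
        alt2 x = let y = x in let x = 3 in let w = y + 100 in (w, x)

        -- alt1 x0 == alt2 x0 == (x0 + 100, 3) for any initial x0
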


There's a clause in the usual definition of the substitution operation in
any standard formulation of the Lambda calculus that takes into account the
name clash above. When x is substituted for y, it is also being substituted
down below in the tuple (w, x, y, z), which slips it in under the "x" in
the assignment "x = 3". The Lambda expression corresponding to this assignment
has the form:


                      ( \x. (\w.(w, x, y, z)) (z + 100) ) 3


To substitute x for y here, you first have to rename the variable x, since it
is bound by a lambda. So you get something like this:


                          ( \v. (\w.(w, v, y, z)) (z + 100) ) 3


and then after substitution


                          ( \v. (\w.(w, v, x, z)) (z + 100) ) 3


Renaming bound variables is called Alpha conversion and preserves equivalence.
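
      For anyone who wants that clause spelled out, here is a bare-bones
sketch of capture-avoiding substitution (the data type and all the names are
mine; any textbook formulation does the same thing, and the fresh-name choice
here is deliberately crude):

        import Data.List ((\\))

        data Term = Var String | Lam String Term | App Term Term
          deriving Show

        free :: Term -> [String]
        free (Var v)   = [v]
        free (Lam v b) = free b \\ [v]
        free (App f a) = free f ++ free a

        -- subst n s t : replace the free occurrences of n in t by s
        subst :: String -> Term -> Term -> Term
        subst n s (Var v)
          | v == n    = s
          | otherwise = Var v
        subst n s (App f a) = App (subst n s f) (subst n s a)
        subst n s (Lam v b)
          | v == n          = Lam v b      -- n is not free below this binder
          | v `elem` free s =              -- name clash: Alpha-convert v first
              Lam v' (subst n s (subst v (Var v') b))
          | otherwise       = Lam v (subst n s b)
          where
            v' = head [ c | k <- [1 ..]
                          , let c = v ++ replicate k '\''
                          , c `notElem` (n : free s ++ free b) ]

        -- For the clash in the article, replacing y by x under the binder \x:
        --   subst "y" (Var "x") (Lam "x" (App (Var "x") (Var "y")))
        --     ==>  Lam "x'" (App (Var "x'") (Var "x"))
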


      What this ultimately translates to in the original assignment sequence is
the following reduction and substitution:


                y = x; z = y; x = 3; w = z + 100; (...w, x, y, z...)


        --> z = x; v = 3; w = z + 100; (...w, v, x, z...)


A similar process yields the reduction:


        --> v = 3; w = x + 100; (...w, v, x, x...)
        --> w = x + 100; (...w, 3, x, x...)
        --> (...x+100, 3, x, x...)


which is basically the same thing as we got before.
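
      Again as a quick check (a Haskell sketch, identifiers mine), the renamed
sequence computes the same values; the tuple slots just have to be read
through the renaming, with the old x living under the name v and the old y
under the name x:

        renamed :: Int -> (Int, Int, Int, Int)
        renamed x = let z = x in let v = 3 in
                    let w = z + 100 in (w, v, x, z)

        -- renamed 5 == (105, 3, 5, 5), the same values the original
        -- sequence produces for its context (w, x, y, z).
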


      Now pointers are an interesting story in themselves. In reducing the
sequence


                        x = 0; y = x; x = 1; (...x, y...)


you'd expect to get
                                                                  (...1, 0...)


and you do. But, with the pointer sequence


                      *x = 0; y = x; *x = 1; (...*x, *y...)


you'd get
                                                                    (...1, 1...)


In functional programming terms, the sequence above would be translated to
the form:


                    let *(x) <- 0 in
                    let y = x in
                    let *(x) <- 1 in (*(x), *(y))


where
                    let f(x) <- y in E


denotes the lambda expression


                    let f = (\z. (z == x)? y: f(z)) in E


which "updates" the function f so that it has value y at x.


      Then this sequence will reduce as follows:


                    let *(x) <- 0 in
                    let y = x in
                    let *(x) <- 1 in (*(x), *(y))


              = let *(x) <- 0 in
                    let y = x in
                    let * = (\z.(z == x)? 1: *(z)) in (*(x), *(y))


          --> let *(x) <- 0 in
                    let y = x in (1, (y == x)? 1: *(y))


          --> let *(x) <- 0 in (1, (x == x)? 1: *(x))


          --> let *(x) <- 0 in (1, 1)


          --> (1, 1)


      In terms of the latter portion of the original assignment sequence, this
translates to the reduction:
                                        *x = 1; (...*x, *y...)
                              --> (...1, (x == y)? 1: *y...)


Note the reduction of *y to a conditional.
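
      Carried to the end in the same sketch (Store and update as above; the
address and the initial store are arbitrary placeholders of my own choosing):

        type Store a b = a -> b

        update :: Eq a => Store a b -> a -> b -> Store a b
        update f x y = \z -> if z == x then y else f z

        example :: (Int, Int)
        example =
          let x     = 0 :: Int            -- some address; the value is arbitrary
              star0 = const 0             -- arbitrary initial store
              star1 = update star0 x 0    -- *x = 0
              y     = x                   -- y = x   (y now aliases x)
              star2 = update star1 x 1    -- *x = 1
          in  (star2 x, star2 y)          -- (*x, *y)  evaluates to (1, 1)

        -- and star2 y unfolds to exactly the conditional above:
        --   star2 y  =  if y == x then 1 else star1 y
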


      While loops are another interesting story in themselves, but that's for
another time...

