Re: Encripted source as an ANDF

rfg@mcc.com (Ron Guilmette)
Sat, 27 May 89 15:56:59 CDT

          From comp.compilers



Recently, albaugh@dms.UUCP (Mike Albaugh) writes:
> From article <3949@ima.ima.isc.com>, by rfg@mcc.com (Ron Guilmette):
> [ much serious discussion of the _politics_ of uglified source ]
> >
> > Another very effective uglifying transformation is to perform comprehensive
> > de-structuring, i.e. the conversion of all loops and well-structured if and
> > switch/case statements into the most degenerate possible equivalent forms
> > using GOTO statements...


> > ... This transformation is also
> > a particularly good candidate for use in an ANDF generator because the
> > semantics of the original source code may be strictly maintained, and
> > the transformation itself should have little or no effect on the quality
> > of the *final* generated (machine-dependent) code (assuming that even
> > modest optimizations take place in the ANDF compiler(s)).
>
> I beg to differ. The sort of transformation suggested here is likely
> to cripple the optimization effort, for much the same reason as cited against
> RTL and the like. If the code is going to be "optimized" by the original
> source->andf translation, assumptions have to be made about the eventual
> target. These assumptions are no better than the RTL ones. If the code is
> supposed to be optimized by the andf->machine_code translation, then the
> control structures and variable scoping need to be preserved so, for example,
> register allocation can be done "intelligently".


Mike talks about two types of "optimizers" here, i.e. SOURCE => ANDF and
ANDF => MACHINE_CODE. One of these possibilities is totally silly in the
current context.


The real beauty of the simple idea I proposed was that almost everybody
already has a C compiler. In the scheme I suggested, this compiler would
also serve (without major modifications) as the ANDF compiler.


Given this assumption, it should be obvious that there would be no need
whatsoever for a SOURCE => ANDF "optimizer", since the ANDF => MACHINE_CODE
transformation (i.e. "normal" compilation) would (presumably) already
have a good optimizer.


Mike says that even for an ANDF => MACHINE_CODE optimizer, "control
structures and variable scoping need to be preserved so, for example,
register allocation can be done 'intelligently'". Well, he may have gotten
it half right. Scoping information may be useful in this regard, but I
never suggested that any scoping information be destroyed. Consider the
destructuring of:


    if (<expression>)
    {
        <local-variable-declarations>
        ...
    }


into:


    if (!(<expression>))
        goto around_999;
    {
        <local-variable-declarations>
        ...
    }
    around_999:


This destructuring transformation (note the negated test, so that the goto
skips the block exactly when the original if would) obviously *does not*
have any effect on scoping information.
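

To make this concrete, here is a small worked example (my own, not from
the earlier postings; the function and label names are hypothetical).
The brace-enclosed block, and hence the scope of `excess', is carried
over verbatim; only the control flow around it is rewritten:


    #include <stdio.h>

    /* Original, structured form. */
    static int clamp_structured(int x)
    {
        if (x > 100)
        {
            int excess = x - 100;   /* block-local variable */
            printf("clipping %d\n", excess);
            x = 100;
        }
        return x;
    }

    /* De-structured form: same block, same scope for `excess',
       but the test is negated and a goto skips the block. */
    static int clamp_destructured(int x)
    {
        if (!(x > 100))
            goto around_999;
        {
            int excess = x - 100;   /* same scope as before */
            printf("clipping %d\n", excess);
            x = 100;
        }
    around_999:
        return x;
    }

    int main(void)
    {
        printf("%d %d\n", clamp_structured(150), clamp_destructured(150));
        return 0;
    }


Both functions print "clipping 50" for an input of 150 and return 100;
a compiler that builds a flow graph sees essentially the same function
twice.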


Regarding the other half of Mike's argument (i.e. that "control structures"
must be preserved to do good optimization), I believe that this is also
patently false. I personally know of no reason why this should be the case,
and I challenge Mike to produce some evidence or proof that such information
improves the ability of an optimizer to do its work (either with respect to
register allocation, or with respect to any other type of commonly used
optimization mechanism).


In fact, quite to the contrary, I believe that the vast majority of modern
optimizers begin their analysis by reducing "higher-level" control constructs
down to their simpler "GOTO" equivalents. Thus, if this transformation is
done at the source level, it should have absolutely no effect on the quality
of optimization for most well-written modern optimizers.
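

As an illustration of that claim (a sketch of mine, not anything from the
thread), consider a loop in its structured form and in the GOTO form that
a de-structuring pass might emit. Once the compiler has lowered the while
into basic blocks, the two are indistinguishable:


    /* Structured form. */
    long sum_structured(const int *a, int n)
    {
        long s = 0;
        int i = 0;
        while (i < n)
        {
            s += a[i];
            i++;
        }
        return s;
    }

    /* GOTO-equivalent form, as a de-structuring pass might emit it.
       The flow graph is identical: a test block, a body block with a
       back edge to the test, and an exit block. */
    long sum_destructured(const int *a, int n)
    {
        long s = 0;
        int i = 0;
    loop_test:
        if (!(i < n))
            goto loop_exit;
        s += a[i];
        i++;
        goto loop_test;
    loop_exit:
        return s;
    }


An optimizer that works on the flow graph (loop detection, induction
variables, register allocation over live ranges) has exactly the same
information available in both cases.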


Mike seems to be saying that there are some optimizers which perform
specialized optimizations *only* on control-flow graphs derived from
"higher-level" control constructs (e.g. if-then-else, while-do, repeat-while,
for, etc.) and *not* on identical control flow graphs which happen to be
derived from some GOTO-filled programs. I believe that this is wrong, and
that all "good" optimizers look for *all* optimization opportunities wherever
they might be found.


> For example, we have a locally developed "silliness reducer" which
> we use on the output of the GreenHills compiler...
> ... [ description of their post-processor which fixes up lousy GreenHills
> output code ] ...


What does this have to do with anything (other than to demonstrate that
GreenHills compilers need more work)?


> ... Similar problems would crop up in an uglification that re-used
> variables, expecting a specific number to occupy registers.


I never suggested this as a "proper" uglification step for an ANDF generator
(and I probably never will)! We *were* talking about de-structuring.
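

For contrast, here is a hypothetical sketch (mine, and emphatically not
part of my proposal) of the variable-reuse uglification Mike is warning
about, to show why I would never suggest it:


    /* Straightforward: `area' and `perim' have short, disjoint
       live ranges that an allocator can place independently. */
    int measure(int w, int h)
    {
        int area = w * h;
        int perim = 2 * (w + h);
        return area + perim;
    }

    /* Variable-reuse "uglification": one temporary `t' carries two
       unrelated values in turn.  An allocator that treats `t' as a
       single entity (rather than splitting it into separate webs)
       now sees one long live range instead of two short ones. */
    int measure_ugly(int w, int h)
    {
        int t, r;
        t = w * h;          /* t is the area here ...        */
        r = t;
        t = 2 * (w + h);    /* ... and the perimeter here.   */
        r = r + t;
        return r;
    }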


// Ron Guilmette - MCC - Experimental Systems Kit Project
// 3500 West Balcones Center Drive, Austin, TX 78759 - (512)338-3740
// ARPA: rfg@mcc.com
// UUCP: {rutgers,uunet,gatech,ames,pyramid}!cs.utexas.edu!pp!rfg
--

