Re: Possible data allocation and instruction scheduling algo...

ollanes@pobox.com (Orlando Llanes)
16 Feb 2002 01:16:12 -0500

          From comp.compilers

Related articles
Possible data allocation and instruction scheduling algo... ollanes@prodigy.net (Orlando Llanes) (2002-01-24)
Re: Possible data allocation and instruction scheduling algo... bryan.hayes@hayestechnologies.com (2002-01-30)
Re: Possible data allocation and instruction scheduling algo... bear@sonic.net (Ray Dillinger) (2002-02-06)
Re: Possible data allocation and instruction scheduling algo... ollanes@pobox.com (2002-02-16)
Re: Possible data allocation and instruction scheduling algo... ollanes@pobox.com (2002-02-16)
From: ollanes@pobox.com (Orlando Llanes)
Newsgroups: comp.compilers
Date: 16 Feb 2002 01:16:12 -0500
Organization: http://groups.google.com/
References: 02-01-118 02-01-161
Keywords: architecture, code
Posted-Date: 16 Feb 2002 01:16:12 EST

bryan.hayes@hayestechnologies.com (Bryan Hayes) wrote
> Regarding instruction scheduling: You are referring to the Pentium
> ... advice).


        You've piqued my curiosity about what has changed. The last
optimization doc I read was Agner Fog's; most of the Pentium rules,
however, still apply. I've tested my optimizations on every AMD
generation from the 486DX4 to the K7 Athlon classic. As far as Intel
goes, I've tested from the 486DX4 through the Pentium and Pentium II,
and some on the PIII. I once got about a 40% speed increase simply by
swapping two instructions :)
        In all my testing, I've noticed that avoiding AGI stalls and
register collisions still provides a significant speed increase. The
Pentium Pro and later supposedly use register aliasing, which should
take care of register collisions, but my testing has indicated
otherwise.
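The kind of win you can get from swapping two instructions can be
illustrated with a toy cost model. Everything below is a made-up
sketch (a generic load-use dependency stall in an idealized
single-issue in-order pipeline, not a real Pentium or AGI model):
moving an independent instruction between a load and its use hides a
cycle of latency.

```python
# Toy in-order, single-issue pipeline model (an illustration only):
# each instruction is (dst, srcs, latency); an instruction stalls
# until every source register's value is available.

def cycles(prog):
    avail = {}          # register -> cycle its value becomes available
    t = finish = 0      # next issue slot / latest completion
    for dst, srcs, lat in prog:
        start = max([t] + [avail[s] for s in srcs])  # stall on operands
        avail[dst] = start + lat
        finish = max(finish, start + lat)
        t = start + 1   # single-issue: one instruction per cycle
    return finish

load_then_use = [
    ("eax", (), 3),        # eax = load [mem]   (3-cycle latency)
    ("ebx", ("eax",), 1),  # ebx = eax + 1      (stalls waiting on eax)
    ("ecx", (), 1),        # ecx = 0            (independent)
]
# Swap the last two instructions so the independent one fills the stall.
swapped = [load_then_use[0], load_then_use[2], load_then_use[1]]

print(cycles(load_then_use), cycles(swapped))  # 5 vs 4 cycles
```

In the first ordering the add sits stalled waiting for the load; in
the second, the independent instruction fills the stall slot and the
whole sequence finishes a cycle earlier.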




> Well, this depends very much on your definition of "artificial
> ... extent.


        I'm talking about Genetic Algorithms, Neural Nets, and possibly
even Fuzzy Logic. This could significantly reduce the number of lines
of code in the compiler. If you can teach the compiler to think like a
human, maybe it can optimize like one :)
        Artificial Intelligence ("AI") is already in use today. I've
heard of a program that, thanks to Genetic Programming, can create a
conflict-free schedule better than a human can. Certain algorithms
(I'm not sure which ones) are in use to predict future stock values.
It's only a matter of time before they're used in a compiler, IMO.




> Anyhow, no compiler matches an experienced human assembly programmer
> (or even comes close, at least in most cases).


        Definitely not; however, I'm sure that with some sort of "AI" a
compiler can come closer than it does now. I'm not saying we should
lose the ability to code in asm; I'd be furious if I couldn't code in
asm, and I'm certain others would be too. All I'm saying is that if
someone would deviate from "standard practice", something better could
be achieved.
        The great thing about "AI" is that certain algorithms adapt.
Since there are lots of constraints, perhaps a Genetic Algorithm (a
Genetic Program, for the purists :P) could be used. Genetic Algorithms
try lots of different combinations until they find a winning one, so
the rewards could be faster execution and nicely scheduled code.
Perhaps the only things that would have to change for each platform
are the back end, plus a config file listing the target's constraints.
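A Genetic Algorithm along those lines can be sketched in a few lines
of Python. The instruction stream, latencies, and pipeline model below
are all invented for illustration: individuals are valid topological
orders of the dependence graph, crossover splices a prefix of one
parent with the remainder in the other parent's order (which stays
valid), mutation swaps an adjacent independent pair, and fitness is
the cycle count from a toy in-order model.

```python
import random

# Made-up toy instruction stream: (dst, srcs, latency). Not real x86.
PROG = [
    ("a", (), 3),          # 0: a = load ...
    ("b", ("a",), 1),      # 1: b = a + 1      (needs a)
    ("c", (), 3),          # 2: c = load ...
    ("d", ("c",), 1),      # 3: d = c + 1      (needs c)
    ("e", ("b", "d"), 1),  # 4: e = b + d
]
N = len(PROG)

# deps[i] = producers instruction i must follow (unique dst names assumed)
deps = [{j for j in range(i) if PROG[j][0] in PROG[i][1]} for i in range(N)]

def cycles(order):
    """Cycle count in a toy single-issue in-order pipeline."""
    avail, t, finish = {}, 0, 0
    for i in order:
        dst, srcs, lat = PROG[i]
        start = max([t] + [avail[s] for s in srcs])  # stall on operands
        avail[dst] = start + lat
        finish = max(finish, start + lat)
        t = start + 1
    return finish

def random_topo():
    """Random valid schedule via Kahn's algorithm with random tie-breaks."""
    done, order = set(), []
    while len(order) < N:
        ready = [i for i in range(N) if i not in done and deps[i] <= done]
        i = random.choice(ready)
        done.add(i)
        order.append(i)
    return order

random.seed(1)
pop = [random_topo() for _ in range(20)]
for _ in range(50):
    pop.sort(key=cycles)
    pop = pop[:10]                       # keep the fittest half
    while len(pop) < 20:
        a, b = random.sample(pop[:10], 2)
        k = random.randrange(1, N)
        # prefix of a + remainder in b's order: still a valid schedule
        child = a[:k] + [i for i in b if i not in a[:k]]
        p = random.randrange(N - 1)      # mutate: swap an independent pair
        if child[p] not in deps[child[p + 1]]:
            child[p], child[p + 1] = child[p + 1], child[p]
        pop.append(child)

best = min(pop, key=cycles)
print("program order:", cycles(list(range(N))), "cycles")
print("best found:   ", cycles(best), "cycles", best)
```

On this tiny example the GA rediscovers the classic scheduling trick
of interleaving the two load-use chains, cutting the cycle count from
9 to 6; the target-specific parts (latencies, dependence rules) are
exactly the sort of thing that could live in the per-platform config
file.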




See ya!
Orlando

