Re: Optimization techniques
Thu, 26 Sep 2019 20:35:19 -0700 (PDT)

          From comp.compilers

Related articles
[34 earlier articles]
Re: Optimization techniques (David Brown) (2019-05-01)
Re: Optimization techniques (Martin Ward) (2019-05-02)
Re: Optimization techniques (Kaz Kylheku) (2019-05-02)
Re: Optimization techniques (Kaz Kylheku) (2019-05-02)
Re: Optimization techniques (Robin Vowels) (2019-05-07)
Re: Optimization techniques (David Brown) (2019-05-07)
Re: Optimization techniques (2019-09-26)

Newsgroups: comp.compilers
Date: Thu, 26 Sep 2019 20:35:19 -0700 (PDT)
Organization: Compilers Central
References: 19-04-004
Keywords: optimize, history
Posted-Date: 27 Sep 2019 01:07:49 EDT
In-Reply-To: 19-04-004

On Wednesday, April 17, 2019 at 8:42:24 AM UTC-5, Rick C. Hodgin wrote:
> Are there resources someone can point me to for learning more about
> time-honored, long-established, safely applied, optimization
> techniques for a C/C++ like language?

cparser uses libfirm as an optimization engine.

Get your hands dirty experimenting with it, and look up each of the methods
libfirm uses (e.g. 'common subexpression elimination') as it mentions them.
A master list of sorts can be found here ( ).
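To make one of those methods concrete, here is a minimal, illustrative sketch of local common subexpression elimination over three-address code. The instruction format and all names are assumptions for illustration, not libfirm's actual API, and it assumes pure operators with no reassignment inside the block:

```python
# Minimal local CSE sketch over three-address code.
# Assumes pure operators and single assignment within the block.

def eliminate_cse(block):
    """Replace repeated (op, arg1, arg2) computations with a copy of
    the variable that already holds the result."""
    seen = {}   # (op, arg1, arg2) -> variable holding the result
    out = []
    for dest, op, a, b in block:
        key = (op, a, b)
        if key in seen:
            out.append((dest, "copy", seen[key], None))
        else:
            seen[key] = dest
            out.append((dest, op, a, b))
    return out

block = [
    ("t1", "+", "a", "b"),
    ("t2", "*", "t1", "c"),
    ("t3", "+", "a", "b"),   # recomputes t1's value: becomes a copy
]
print(eliminate_cse(block))
```

A real pass would work on SSA form and fold this into global value numbering, but the table-of-seen-expressions idea is the same.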

> I'm walking the abstract syntax tree and am able to find many kinds of
> optimizations, but I would like to learn some theory or pitfalls of
> various types of optimizations applied.

One kind of optimization it does NOT do - which I did to the whole cparser
source - is language-level optimization (i.e. refactoring, reengineering).
This is, by far, the more important set of optimizations, since it eliminates
the large, significant accumulation of code debt ( ) that accrues in large
distributions ... and that has seriously plagued the entire GNU codebase ...
when the continual need for refactoring is ignored.
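As a hypothetical illustration of the kind of language-level cleanup meant here, the following sketch flags textually duplicated function bodies as refactoring candidates. The function names, the bodies, and the crude whitespace normalization are all made up for illustration; real clone detection works on token streams or ASTs:

```python
# Hypothetical sketch: find exact-duplicate function bodies
# (modulo whitespace) as candidates for refactoring into one routine.
import hashlib

def clone_groups(functions):
    """Group function names whose whitespace-normalized bodies match."""
    groups = {}
    for name, body in functions.items():
        norm = " ".join(body.split())              # crude normalization
        digest = hashlib.sha256(norm.encode()).hexdigest()
        groups.setdefault(digest, []).append(name)
    return [g for g in groups.values() if len(g) > 1]

funcs = {
    "draw_box":  "rect(x, y, w, h); fill(c);",
    "paint_box": "rect(x, y,  w, h);  fill(c);",   # same modulo spacing
    "draw_dot":  "point(x, y, c);",
}
print(clone_groups(funcs))   # [['draw_box', 'paint_box']]
```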

When left untended for too long, a code base turns into legacy code ( ). At
that point we talk about legacy rescue or legacy code refactoring.

That should also be handled by, and integrated within, the compiler, as an
additional, optional, front-end-only optimization stage, to whatever extent is
possible. A key ingredient in this process is a nuts-and-bolts Artificial
Intelligence method known as formal concept analysis (concept lattices).

"Detecting Software Patterns Using Formal Concept Analysis"

"Revealing Class Structure with Concept Lattices"
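For a concrete feel of the technique those papers describe, here is a toy brute-force enumeration of the formal concepts (extent/intent pairs) of a tiny object-attribute context. The context itself is invented for illustration; real tools use incremental algorithms rather than enumerating subsets:

```python
# Toy formal concept analysis: enumerate all formal concepts of a
# small object x attribute context by brute force.
from itertools import combinations

context = {                      # object -> set of attributes
    "circle":  {"round", "closed"},
    "ellipse": {"round", "closed"},
    "line":    {"straight"},
}
objects = set(context)
attributes = set().union(*context.values())

def common_attrs(objs):
    """Attributes shared by every object in objs (all attrs if empty)."""
    if not objs:
        return attributes
    return attributes.intersection(*(context[o] for o in objs))

def common_objs(attrs):
    """Objects that have every attribute in attrs."""
    return {o for o in objects if attrs <= context[o]}

# A concept is a pair (extent, intent) where each determines the other.
concepts = set()
for r in range(len(objects) + 1):
    for objs in combinations(sorted(objects), r):
        intent = frozenset(common_attrs(set(objs)))
        extent = frozenset(common_objs(intent))
        concepts.add((extent, intent))

for extent, intent in sorted(concepts, key=lambda c: len(c[0])):
    print(sorted(extent), sorted(intent))
```

Applied to code, the objects would be routines and the attributes the fields or helpers they use; concepts with a shared intent suggest a class or module boundary, which is the idea behind the two papers above.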

All of this should be considered part of the compilation and optimization
process. Any code analysis and [re-]synthesis is compilation ... even if it
entails automating elements of the task of software engineering itself.
Refactoring, in turn, is the optimization stage of that process.

I'm also experimenting with hybridizing singular value decomposition and
factor analysis with formal concept lattices. For instance, I have a large set
of graphics routines (about 150) that needs to be refactored and reintegrated.
I'm running all three types of analysis on it and working my way up to a
utility that can do all three in tandem.
