Machine Learning in Optimization - email@example.com (Stephan Ceram) (2008-06-02)
Re: Machine Learning in Optimization - firstname.lastname@example.org (Björn Franke) (2008-06-04)
Re: Machine Learning in Optimization - email@example.com (Stephan Ceram) (2008-06-09)
Re: Machine Learning in Optimization - firstname.lastname@example.org (Björn Franke) (2008-06-10)
Date: Wed, 04 Jun 2008 15:44:56 +0100
Posted-Date: 04 Jun 2008 10:44:30 EDT
[... machine learning in compilers ...]
> Just out of curiosity I was wondering how the machine learning system
> is integrated into a compiler. Is this done by having a data base
> which is extended during learning or are the results stored in a file?
For the particular paper you are referring to, the data was collected
in text files containing comma-separated values. These files were
subsequently processed by various machine learning algorithms
implemented in Matlab. However, this is just a technical
implementation issue. Whether you use a relational database or just
some proprietary data repository doesn't really matter. In terms of
interfacing, though, this approach is certainly not the most efficient
way of doing it.
> And what free tools are there which can be integrated in a C++
> project? Any powerful C++ libraries?
You may want to look at GCC ICI (the Interactive Compilation
Interface). Basically, GCC ICI is an initiative to open up the
internal heuristics of GCC and to allow an external tool to make the
decisions for GCC whenever it has to decide whether to apply a
transformation or which transformation parameter to choose. So, GCC
ICI is a GCC framework with handles to an external decision-making
tool, which may or may not be based on machine learning.
In terms of machine learning packages, WEKA may be useful to you. As
WEKA is written in Java, you may need to develop your own C++
wrappers, though. Some information on bridging WEKA and .NET can be
found in .