Run time optimizations firstname.lastname@example.org (Sanjay Jinturkar) (1993-04-20)
Re: Run time optimizations email@example.com (1993-04-22)
Re: Run time optimizations firstname.lastname@example.org (1993-04-23)
Re: Run time optimizations email@example.com (1993-04-23)
Re: Run time optimizations firstname.lastname@example.org (1993-04-24)
Re: Run time optimizations email@example.com (1993-04-28)
From: Sanjay Jinturkar <firstname.lastname@example.org>
Keywords: optimize, question, comment
Organization: University of Virginia Computer Science Department
Date: Tue, 20 Apr 1993 02:10:16 GMT
Compilers generate very conservative code. In the absence of some information,
the compiler assumes the worst case and generates code accordingly.
How about generating two pieces of code - one conservative and the other
with some aggressive optimizations - and then making a check at run
time (for the information that was missing at compile time) to see which
piece of code should be executed? An example use of such a technique could
be an optimization which would be safe only in the absence of
aliasing. The aliasing information could be checked at run time and the
appropriate piece of code executed. Will such techniques pay? Is
there previous work in this area? If so, could someone give some
pointers to such work?
Thanks in advance.
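The alias-check idea above can be sketched roughly as follows. This is a
hypothetical hand-written illustration, not output from any particular
compiler: two versions of the same loop are emitted, and a run-time test on
the operand addresses picks between them. The function names and the choice
of an array-add loop are my own invention for the example.

```c
#include <stddef.h>

/* Aggressive version: the restrict qualifier asserts no aliasing,
   so a compiler is free to reorder, unroll, or vectorize the loop. */
static void add_fast(int *restrict dst, const int *restrict src, size_t n)
{
    for (size_t i = 0; i < n; i++)
        dst[i] += src[i];
}

/* Conservative version: no aliasing assumption, so loads and stores
   must be kept in strict element-by-element order. */
static void add_safe(int *dst, const int *src, size_t n)
{
    for (size_t i = 0; i < n; i++)
        dst[i] += src[i];
}

/* Dispatcher: the run-time check the post proposes. If the two
   regions provably do not overlap, run the optimized version;
   otherwise fall back to the conservative one. */
void add(int *dst, const int *src, size_t n)
{
    if (dst + n <= src || src + n <= dst)
        add_fast(dst, src, n);
    else
        add_safe(dst, src, n);
}
```

The overlap test itself costs only two pointer comparisons per call, so the
technique pays whenever the loop is long enough to amortize that check plus
the code-size cost of the duplicated body.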
[People have done this from time to time. The HP3000 APL system in the late
1970s generated code on the fly. The first time you ran a function, it
generated very optimistic code that assumed that the arguments to the
function would always be of the same type and shape as they were on the first
call, with "signature" code at the beginning to check that the assumptions
were satisfied. If not, it fell back into the compiler which generated
slower but more general code. It worked pretty well considering how slow
the underlying machine was. -John]