|The melting ice technology (1): compilers & interpreters email@example.com (1994-05-09)|
|The melting ice technology (2): levels firstname.lastname@example.org (1994-05-09)|
|Re: The melting ice technology (2): levels email@example.com (1994-05-12)|
|Re: The melting ice technology (2): levels firstname.lastname@example.org (1994-05-13)|
|Re: The melting ice technology (2): levels email@example.com (1994-05-13)|
|Re: The melting ice technology (2): levels firstname.lastname@example.org (1994-05-14)|
|Re: The melting ice technology (2): levels email@example.com (1994-05-16)|
From: firstname.lastname@example.org (Paul Prescod)
Organization: University of Waterloo
Date: Fri, 13 May 1994 16:53:30 GMT
>Come on. The preceding description of Professional Eiffel's large-grain
>recompilation and small-grain re-interpretation is very interesting, and
>it sounds like a hybrid of recompilation and interpretation techniques.
>But to reach the above conclusion, you have stretched definitions of
>`compile' and `interpret' well beyond common usage to the point of
>sophistry. You have defined them such that all systems are both compilers
>and interpreters, and thus made them meaningless.
Well, pretty much all systems ARE compilers and interpreters. The line
between them is not very clearly drawn. I agree with you, however,
that to say flat out "Personal Eiffel is compiled" would mislead most
people.
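To see how blurred the compiler/interpreter line is, consider a sketch in the style of CPython, which compiles source text to bytecode and then interprets that bytecode; this is offered only as an illustration of the hybrid pattern, not as a description of Eiffel's mechanism:

```python
# CPython shows the compiler/interpreter hybrid in one system:
# a compilation step produces bytecode, an interpretation step runs it.
import dis

code = compile("x = 2 + 3", "<example>", "exec")  # the "compiler" half
namespace = {}
exec(code, namespace)                             # the "interpreter" half
assert namespace["x"] == 5

dis.dis(code)  # inspect the bytecode the compilation step produced
```

Calling such a system either "compiled" or "interpreted" alone describes only half of what it does.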
IBM has a similar process for REXX on OS/2, but they didn't see the
opportunity to turn it into an advantage as ISE does. Instead they call
it "tokenizing." Or is there a difference that I haven't perceived between
"Melting Ice Technology" and REXX tokenizing?