What is the future of Compiler ? email@example.com (blerta bishaj) (2006-06-12)
Re: What is the future of Compiler ? firstname.lastname@example.org (2006-06-15)
What is the future of Compiler ? email@example.com (Inderaj Bains) (2006-06-15)
Re: What is the future of Compiler ? firstname.lastname@example.org (Eric) (2006-06-15)
Re: What is the future of Compiler ? email@example.com (Oliver Bandel) (2006-06-19)
Re: What is the future of Compiler ? firstname.lastname@example.org (Friedrich Dominicus) (2006-06-22)
Re: What is the future of Compiler technology? email@example.com (Tommy Thorn) (2006-07-05)
Re: What is the future of Compiler technology? firstname.lastname@example.org (2006-07-06)
Re: What is the future of Compiler technology? email@example.com (Eliot Miranda) (2006-07-19)
What is the future of Compiler ? firstname.lastname@example.org (1995-12-09)
Date: 15 Jun 2006 15:01:19 -0400
blerta bishaj wrote:
> Can you give me any hints as to where compilers are heading?
This is easiest if you consider the history of where compilers and
parsing have come from. Consider any trends and extend them forward as
a projection of the future.
It started with huge multi-pass designs for COBOL and FORTRAN. The
internal design of those parsers was shaped specifically to work around
the small amount of memory available.
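To make the multi-pass idea concrete, here is a toy sketch (in Python, for compactness; the mini-language and function names are hypothetical, not any real compiler's design). Each pass makes one trip over the program, so no pass needs everything in memory at once:

```python
# Toy two-pass "assembler" for a hypothetical mini-language:
# pass 1 scans the program and records each label's address;
# pass 2 resolves symbolic jump targets to those addresses.
# Splitting the work into passes like this is how early compilers
# coped with machines too small to hold the whole job at once.

def assemble(lines):
    # Pass 1: collect label -> instruction index.
    labels = {}
    code = []
    for line in lines:
        if line.endswith(":"):
            labels[line[:-1]] = len(code)
        else:
            code.append(line)
    # Pass 2: resolve symbolic jump targets.
    out = []
    for instr in code:
        op, *args = instr.split()
        if op == "jmp":
            out.append(("jmp", labels[args[0]]))
        else:
            out.append((op, *args))
    return out

program = ["start:", "load x", "jmp start"]
print(assemble(program))  # [('load', 'x'), ('jmp', 0)]
```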
Then we moved into block-structured languages (Algol, Pascal, and even
C). The parsers needed different algorithms. Also, more memory was
available. Parsing adapted to the change in language design, and to the
extra memory.
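One reason block structure forced new algorithms: nesting in the grammar maps naturally onto recursion (a stack) in the parser. A minimal recursive-descent sketch in Python, for a made-up grammar of nested arithmetic:

```python
# Minimal recursive-descent parser/evaluator, the style of algorithm
# block-structured languages pushed parsers toward: each grammar rule
# becomes a function, and nesting becomes recursion.
# Grammar: expr -> term (('+'|'-') term)* ; term -> NUM | '(' expr ')'

def parse(tokens):
    pos = 0

    def peek():
        return tokens[pos] if pos < len(tokens) else None

    def expr():
        nonlocal pos
        value = term()
        while peek() in ("+", "-"):
            op = tokens[pos]; pos += 1
            rhs = term()
            value = value + rhs if op == "+" else value - rhs
        return value

    def term():
        nonlocal pos
        tok = tokens[pos]; pos += 1
        if tok == "(":                    # nested block -> recursive call
            value = expr()
            assert tokens[pos] == ")"; pos += 1
            return value
        return int(tok)

    return expr()

print(parse(["(", "1", "+", "2", ")", "-", "3"]))  # 0
```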
Then we moved into OOP: C++, Java, .NET. Again, parsers had to change
to work with the new metaphors. We have even more memory now, so we
want to leverage the extra memory by making a fatter design model. If
computers can do more work, developers can do less work. This makes
more maintainable systems and improves developer productivity.
You need to research exactly how parsers have changed at each stage,
and then formulate ideas for the future.
Another thing to consider: some IDEs now use parsers to validate
syntax/semantics as programs are entered. This started with legacy
QuickBASIC, but QB was using it to do part of compilation as users
entered each line (because PCs were slow). But now it's only done to
help developers find errors, since actual compilation is pretty fast.
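A sketch of that editor-style checking, assuming Python for illustration (a real IDE keeps an incremental parse tree; here the built-in compile() just stands in for the IDE's internal parser):

```python
# On-the-fly syntax checking: after each edit, re-parse the whole
# buffer and report the first error, without running anything.
# compile() parses the source and raises SyntaxError on bad input.

def check_syntax(buffer):
    try:
        compile(buffer, "<editor-buffer>", "exec")
        return None                      # nothing to flag in the UI
    except SyntaxError as e:
        return (e.lineno, e.msg)         # line and message for the UI

print(check_syntax("def f():\n    pass"))  # None
print(check_syntax("def f(:\n    pass"))   # error reported near line 1
```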
Another thing to consider: functional languages have a different
metaphor. Where have these come from and where are they going? How are
they different from imperative languages? Or, to broaden the topic,
dynamic languages in general are quite different from classic
languages, and they become more attractive as computers get faster and
have more memory.
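The difference in metaphor is easy to show side by side. Here is the same computation both ways, in Python (chosen only because it supports both styles): an imperative loop that mutates state step by step, versus a functional pipeline of pure expressions.

```python
# Sum the squares of the even numbers, in each metaphor.
from functools import reduce

def imperative(nums):
    total = 0                  # explicit mutable state
    for n in nums:
        if n % 2 == 0:
            total += n * n
    return total

def functional(nums):
    # No mutation: filter, then fold the squares into a sum.
    return reduce(lambda acc, n: acc + n * n,
                  filter(lambda n: n % 2 == 0, nums),
                  0)

nums = [1, 2, 3, 4]
print(imperative(nums), functional(nums))  # 20 20
```

The compilers differ accordingly: the imperative form hands the compiler a sequence of state changes, while the functional form hands it composable expressions it can reason about more freely.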
I see languages forking to some extent, but also they will take up
benefits of different models and use them. For example, C# 3.0 will
support lambda expressions. This is a bridging of two seemingly
incompatible models. Your task can be exciting, but you need to study
well before you jump to any conclusions.
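As an analogous illustration of that bridging (not C# itself): Python, an imperative/OO language, borrowed the same functional feature, small anonymous functions passed to higher-order code.

```python
# A lambda supplies a sort key inline, where a purely imperative
# design would have needed a separately named comparison routine.
people = [("ada", 36), ("grace", 45), ("alan", 41)]

by_age = sorted(people, key=lambda p: p[1])
print(by_age[0][0])  # ada
```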