Re: What is the future of Compiler ?

"Eric" <englere_geo@yahoo.com>
15 Jun 2006 15:01:19 -0400


Related articles
What is the future of Compiler ? blertadn@yahoo.com (blerta bishaj) (2006-06-12)
Re: What is the future of Compiler ? torbenm@app-1.diku.dk (2006-06-15)
What is the future of Compiler ? inderaj@gmail.com (Inderaj Bains) (2006-06-15)
Re: What is the future of Compiler ? englere_geo@yahoo.com (Eric) (2006-06-15)
Re: What is the future of Compiler ? oliver@first.in-berlin.de (Oliver Bandel) (2006-06-19)
Re: What is the future of Compiler ? frido@q-software-solutions.de (Friedrich Dominicus) (2006-06-22)
Re: What is the future of Compiler technology? tommy.thorn@gmail.com (Tommy Thorn) (2006-07-05)
Re: What is the future of Compiler technology? torbenm@app-4.diku.dk (2006-07-06)
Re: What is the future of Compiler technology? eliotm@pacbell.net (Eliot Miranda) (2006-07-19)
What is the future of Compiler ? pschen@casd2.iie.ncku.edu.tw (1995-12-09)
From: "Eric" <englere_geo@yahoo.com>
Newsgroups: comp.compilers
Date: 15 Jun 2006 15:01:19 -0400
Organization: http://groups.google.com
References: 06-06-044
Keywords: history
Posted-Date: 15 Jun 2006 15:01:19 EDT

blerta bishaj wrote:
> Can you give me any hints as to where compilers are heading?


This is easiest if you consider the history of where compilers and
parsing have come from. Consider any trends and extend them forward as
a projection of the future.


It started with huge multi-pass designs for COBOL and FORTRAN. The
internal design of the parser was developed specifically to work around
the small amount of memory available.


Then we moved into block-structured languages (Algol, Pascal, and even
C). The parsers needed different algorithms, and more memory was
available. Parsing adapted to the change in language design and to the
improved hardware.


Then we moved into OOP: C++, Java, .NET. Again, parsers had to change
to work with the new metaphors. We have even more memory now, so we
want to leverage the extra memory with a richer, heavier design model.
If computers can do more work, developers can do less work, which makes
systems more maintainable and improves developer productivity.


You need to research exactly how parsers have changed at each stage,
and then formulate ideas for the future.


Another thing to consider: some IDEs now use parsers to validate syntax
and semantics as programs are entered. This started back with
QuickBASIC, although QB used it to do part of the compilation as each
line was entered (because PCs were slow). Now it's done mainly to help
developers find errors as they type, since actual compilation is fast
enough anyway.
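
As a rough illustration of the idea (a minimal sketch, not how any
particular IDE actually does it), here is what re-checking a buffer
after every edit looks like in Python, using its own parser:

    import ast

    def check_syntax(source):
        # re-parse the whole buffer; return the first syntax error, if any
        try:
            ast.parse(source)
            return None
        except SyntaxError as err:
            return err.lineno, err.msg

    # an editor would call this after each keystroke or pause
    buffer = "def f(x):\n    return x +\n"
    error = check_syntax(buffer)
    if error:
        print("syntax error on line %d: %s" % error)

A real IDE would parse incrementally and keep symbol tables around, but
the loop is the same: parse on every change, flag errors immediately,
and only run the full compiler when the developer asks for it.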


Another thing to consider: functional languages have a different
metaphor. Where have they come from, and where are they going? How are
they different from imperative languages? Or, to broaden the topic,
dynamic languages in general are quite different from classic
languages, and they become more attractive as computers get faster and
gain more memory.
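
A tiny example of the difference in metaphor (sketched in Python rather
than a real functional language): summing the squares of a list
imperatively, by mutating an accumulator, versus functionally, by
composing expressions with no mutation at all.

    numbers = [1, 2, 3, 4]

    # imperative metaphor: a sequence of steps that mutate state
    total = 0
    for n in numbers:
        total += n * n
    print(total)  # 30

    # functional metaphor: describe the value as a composition of expressions
    print(sum(n * n for n in numbers))  # 30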


I see languages forking to some extent, but they will also take up the
benefits of different models and use them. For example, C# 3.0 will
support lambda functions in some ways. This is a bridging of two
seemingly incompatible models. Your task can be exciting, but you need
to study well before you jump to any conclusions.
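
To make the bridge concrete (a sketch in Python rather than C# 3.0
syntax, since the point is the idea and not the language): small
anonymous functions passed as values into otherwise ordinary
imperative code.

    orders = [("widget", 3), ("gadget", 1), ("gizmo", 2)]

    # imperative data, but the sort key, filter, and projection are lambdas
    orders.sort(key=lambda item: item[1])
    big = [item for item in orders if item[1] >= 2]
    names = list(map(lambda item: item[0], big))

    print(names)  # ['gizmo', 'widget']

C# 3.0's actual syntax differs, but the shift it represents is the
same: an imperative language absorbing a core functional idea.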


