Re: Current work in compiler/language design.

chambers@cs.washington.edu (Craig Chambers)
Mon, 18 Nov 91 23:36:33 -0800

          From comp.compilers

Newsgroups: comp.compilers
From: chambers@cs.washington.edu (Craig Chambers)
Keywords: design
Organization: University of Washington Computer Science and Engineering
References: 91-11-030 91-11-060
Date: Mon, 18 Nov 91 23:36:33 -0800

In article 91-11-066 Bob Hathaway writes:
[ a whole lot of stuff ]


[re: parallelization is a small area]


As most readers of this list are probably already aware, research in
automatic parallelization of dusty-deck programs and the development of new
languages that support even better parallelization are *very* hot topics; just
scan the contents of recent SIGPLAN PLDI proceedings (the "flagship"
conference for compiler papers and practical language papers). Of course,
plenty of other interesting topics are covered in those proceedings as
well. POPL is the other "flagship" conference, particularly for more
theoretical work, both in compiler algorithms and in language design issues
such as type theory.


[re: Lisp is slow, interpreted, etc.]


Clearly this poster is not aware of much of the research of the past decade
in the Lisp world. Most Lisp systems are compiled nowadays, achieving
speeds close to those of traditional languages. I measured some benchmarks
recently on the current T compiler (described in a SIGPLAN '86 paper by Kranz
et al., incidentally) and the Allegro Common Lisp compiler, and for "normal"
dynamically-typed Lisp code these systems ran within a factor of 5
(10 for the worst offenders) of optimized C. After hand-tuning, adding type
declarations, turning on the optimizers full-blast, turning off safety,
sacrificing overflow checking, etc., I got these benchmarks to run at close to
half the speed of optimized C. [Begin advertisement: This speed is a bit
slower than Self achieves on the same benchmarks with no hand-tuning. End
advertisement.] That is quite a bit better than the "hundreds of times slower"
cited by the previous poster.
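
For readers who haven't seen this kind of tuning, here is a rough Common
Lisp sketch of what I mean (illustrative only; this function and its
declarations are not one of the benchmarks I measured):

    ;; Illustrative sketch only -- not one of the measured benchmarks.
    ;; "Normal" dynamically-typed version: the compiler must emit
    ;; generic arithmetic and run-time type and bounds checks.
    (defun sum-squares (v)
      (let ((sum 0))
        (dotimes (i (length v) sum)
          (incf sum (* (aref v i) (aref v i))))))

    ;; Hand-tuned version: type declarations added, optimizer turned up
    ;; full-blast, safety (and with it overflow and bounds checking)
    ;; turned off.
    (defun sum-squares-tuned (v)
      (declare (optimize (speed 3) (safety 0) (debug 0))
               (type (simple-array double-float (*)) v))
      (let ((sum 0.0d0))
        (declare (type double-float sum))
        (dotimes (i (length v) sum)
          (incf sum (* (aref v i) (aref v i))))))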


[re: multi-methods are not messages]


Of course multi-methods are messages! They simply extend traditional
singly-dispatched, receiver-oriented messages to dispatch potentially on
several arguments. I'm not sure the poster understands multi-methods as
implemented in languages like CLOS. Multi-methods do have a cost: they
subtly change the feel of programming, away from data-abstraction-oriented
programming and towards a more procedure-oriented style, but that does not
mean that multi-methods are somehow not OO.
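
For concreteness, here is a small CLOS sketch of a multi-method whose
applicable method is chosen by the dynamic classes of *both* arguments (the
classes and the generic function are invented for illustration):

    ;; Illustrative sketch only; the class and function names are invented.
    (defclass ship () ())
    (defclass asteroid () ())

    ;; A generic function whose methods are selected by the run-time
    ;; classes of both arguments, not by a single distinguished receiver.
    (defgeneric collide (a b))

    (defmethod collide ((a ship) (b ship))
      'bounce)

    (defmethod collide ((a ship) (b asteroid))
      'ship-destroyed)

    (defmethod collide ((a asteroid) (b ship))
      (collide b a))   ; handle the symmetric case by reusing the above

    ;; (collide (make-instance 'ship) (make-instance 'asteroid))
    ;;   => SHIP-DESTROYED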


[re: everything that is good is OO, by definition]


I don't think I need to rebut this content-free assertion. In my view, OOP
has very specific characteristics, and the quality of those characteristics
can be judged objectively. OOP extends traditional abstract data types
(which already support encapsulation, modularization, reuse of code through
instantiation, etc.) with two features: dynamic binding of message names to
method implementations, and inheritance of the implementation of one ADT by
another. In statically-typed OO languages, subtyping is a necessary
consequence of dynamic binding (not of inheritance, as is commonly believed).
Both of these features have their advantages and disadvantages; neither is
uniformly good. In my view their advantages outweigh their disadvantages,
but I am not blind to the possible drawbacks.
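
To make those two features concrete, here is another small CLOS sketch
(again with invented names) showing inheritance of one ADT's implementation
by another and the dynamic binding of a message name to a method chosen by
the run-time class of its argument:

    ;; Illustrative sketch only; the classes and methods are invented.
    (defclass account ()
      ((balance :initarg :balance :accessor balance :initform 0)))

    ;; CHECKING-ACCOUNT inherits ACCOUNT's implementation: its slots and
    ;; its applicable methods.
    (defclass checking-account (account)
      ((overdraft-limit :initarg :overdraft-limit
                        :accessor overdraft-limit :initform 0)))

    (defgeneric withdraw (acct amount))

    (defmethod withdraw ((acct account) amount)
      (decf (balance acct) amount))

    ;; Dynamic binding: which WITHDRAW method runs depends on the
    ;; run-time class of ACCT, not on any static declaration.
    (defmethod withdraw ((acct checking-account) amount)
      (if (<= amount (+ (balance acct) (overdraft-limit acct)))
          (call-next-method)            ; reuse the inherited behavior
          (error "Overdraft limit exceeded.")))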


[re: trends towards OO]


Much of this trend, in terms of books published and so on, is just people
jumping on the bandwagon created by the incredible over-hyping of OO. I doubt
that the contributions in the OO area merit the number of snake-oil salesmen
making money off of the new buzzword. If anything, OO marketing
hype is benefiting from people who had never heard of ADTs before. Also,
other paradigms have their own conferences: OOP has OOPSLA and ECOOP,
functional programming has LFP and FPCA (among others), logic programming has
its conferences, and so on. There are special-purpose conferences for nearly
all languages and programming paradigms; OOPSLA may simply be prominent
because of the large number of attendees who have come to find out what the
hype is all about.


Since I too am an OOP advocate, I am dismayed by others who promote OOP in
this way. I hope that readers of the previous postings do not dismiss OOP,
and the research being done in OOP, as similarly content-free, blind to
previous and related work, and oversold. On the contrary, I believe that OOP
is a very interesting area for current research, and that OO ideas can be
integrated into other domains. Of course, ideas from other domains should
be, and are being, integrated into OOP, including ideas from such supposedly
minor and limited fields as parallel programming, logic programming, and
functional programming. Any good computer scientist, and particularly a good
language designer, should be familiar with all of these approaches to
developing and understanding programs, partly to avoid ignorant flaming.


-- Craig Chambers
--

