Reference Works wanted: compiler techniques for parallel hardware?

Ray Dillinger <bear@sonic.net>
2 Jul 2005 20:21:52 -0400

          From comp.compilers

Related articles
Reference Works wanted: compiler techniques for parallel hardware? bear@sonic.net (Ray Dillinger) (2005-07-02)
Re: Reference Works wanted: compiler techniques for parallel hardware? gneuner2@comcast.net (George Neuner) (2005-07-05)
Re: Reference Works wanted: compiler techniques for parallel hardware? drizzle76@gmail.com (dz) (2005-07-11)
From: Ray Dillinger <bear@sonic.net>
Newsgroups: comp.compilers
Date: 2 Jul 2005 20:21:52 -0400
Organization: Compilers Central
Keywords: parallel, question
Posted-Date: 02 Jul 2005 20:21:52 EDT

Are there any well-developed strategies for designing languages that
take advantage of parallel hardware?


When people want to take advantage of DSPs or graphics coprocessors
that do a lot of parallel and/or dataflow stuff, it seems like they
almost always wind up going down to the bare wires and coding in
assembler. IOW, the compiler writers have so far been failing these
people. And it's hard to imagine language constructs that let such
hardware be used in a general and efficient way.
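
To make that concrete, here is a minimal sketch (my own hypothetical
example, not taken from any existing language or library) of the kind
of kernel I mean: an FIR filter, the bread and butter of DSP work.
The compiler would have to recognize the pattern and map it onto the
chip's multiply-accumulate units and address generators, which is
exactly what people end up doing by hand in assembler today.

    /* Hypothetical example: a plain-C FIR filter.  The inner loop is
     * a multiply-accumulate chain that a DSP can do in one cycle per
     * tap, but only if the compiler recognizes the pattern and
     * schedules it onto the MAC units.
     * Assumes x holds at least n + taps - 1 samples. */
    void fir(const float *x, const float *h, float *y, int n, int taps)
    {
        for (int i = 0; i < n; i++) {
            float acc = 0.0f;
            for (int k = 0; k < taps; k++)
                acc += h[k] * x[i + k];   /* multiply-accumulate */
            y[i] = acc;
        }
    }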


So, I'm looking for reference works; have any books been written about
the design of languages to support parallel and dataflow operations on
massively concurrent hardware like DSPs, or about how compilers for
them are implemented? It seems like code generation would become a
nonlinear problem.


Bear
[It is my impression that DSPs and graphics processors are quirky enough
that it's difficult to come up with code generation strategies that
produce the code you want. For conventional vector and parallel
machines, there's plenty of work on compiling Fortran and C, perhaps
with some hints in the source code. -John]
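
(For concreteness, one example of the sort of source-code hint John
mentions -- a generic OpenMP sketch of my own, not drawn from any
particular vectorizing compiler: a directive asserting that the loop
iterations are independent, so the compiler is free to split them
across processors or vectorize them.  Directives along the lines of
ivdep serve a similar purpose in some vendor compilers.)

    /* Hypothetical sketch of a source-level hint: the OpenMP pragma
     * tells the compiler the iterations are independent, so it may
     * run them in parallel.  Without OpenMP support the pragma is
     * simply ignored and the loop runs serially. */
    void saxpy(int n, float a, const float *x, float *y)
    {
        int i;
    #pragma omp parallel for
        for (i = 0; i < n; i++)
            y[i] = a * x[i] + y[i];
    }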


