GPU-aware compiling? firstname.lastname@example.org (Tomasz Chmielewski) (2005-05-20)
Re: GPU-aware compiling? email@example.com (Michael Tiomkin) (2005-05-22)
Re: GPU-aware compiling? firstname.lastname@example.org (Oleg V.Boguslavsky) (2005-05-22)
Re: GPU-aware compiling? email@example.com (firstname.lastname@example.org) (2005-05-24)
Re: GPU-aware compiling? email@example.com (Rob Dimond) (2005-05-24)
Re: GPU-aware compiling? firstname.lastname@example.org (2005-05-24)
Re: GPU-aware compiling? email@example.com (Ray Dillinger) (2005-06-26)
Re: GPU-aware compiling? firstname.lastname@example.org (Julian Stecklina) (2005-07-02)
From: Julian Stecklina <email@example.com>
Date: 2 Jul 2005 20:19:54 -0400
References: 05-05-184 05-05-207 05-06-131
Ray Dillinger <firstname.lastname@example.org> writes:
> And although I haven't been able to work out the kinks and get stuff
> to run directly on them, I'm convinced that the kinks are workable-out
> and that the DSP's in the graphics card can speed up this kind of work
> by an order of magnitude or more with adequate compiler support.
If you are near a university, you might want to check whether they have
any NEC SX-style boxes around. Your description of the problem sounds
as if it would be ideal work for a vector computer.
"We were not out to win over the Lisp programmers; we were after the
C++ programmers. We managed to drag a lot of them about halfway to
Lisp." - Guy Steele, Java spec co-author