From: David Chase <chase@world.std.com>
Newsgroups: comp.compilers
Subject: re: Shared-memory consistency: register file
Date: 27 May 2002 01:21:37 -0400
Organization: Compilers Central
Keywords: parallel
Posted-Date: 27 May 2002 01:21:36 EDT
>Specifically, work which relates to a weak consistency model, where-by
>synchronization "instructions" are known and thus consistency must
>only be guaranteed at these points - ala storing all shared variables
>before the synchronization primitive and ensuring no old values live
>in the register file by issuing reloads, if necessary, after the
>synchronization.
There are several leads you might follow. I think that
Bill Pugh's work on the Java memory model might be
helpful.
http://www.cs.umd.edu/~pugh/java/memoryModel/
and in particular
http://www.cs.umd.edu/~pugh/java/memoryModel/semantics.pdf
There was work done at Dec/Compaq/HP on a research Java compiler which
used Cliff Click's optimization framework, and one of the interesting
things they found was that they needed (and found a way) to model
changes to memory at synchronization points to prevent disaster.
Can't recall the name of the stuff, but it's no doubt in one of their
tech reports.
What Pugh's work points out is the need to chart a careful course
between uselessly lax (compilers can do amusing things to your code)
and ridiculously overspecified. The original Java memory model has
several silly consequences that no sensible program depends on. So,
you've not only got to have semantics, you've got to understand what
they mean and they've got to mean "what people want".
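As a concrete illustration of the scheme in the quoted question, here is a minimal Java sketch (the class and method names are my own invention, not from any of the work cited above): each monitor entry and exit is a synchronization point, so the compiler must flush stores to shared variables before the lock is released, and may not keep `ready` cached in a register across the monitor entries in the reader's loop; it must reload it on each iteration.

```java
// Hedged sketch: synchronization points as compiler barriers.
// The writer stores shared state before releasing the lock; the reader
// re-enters the monitor each iteration, forcing a reload of 'ready'
// rather than allowing a stale register copy to spin forever.
public class SyncReload {
    static int data = 0;
    static boolean ready = false;
    static final Object lock = new Object();

    static int readAfterSync() throws InterruptedException {
        Thread writer = new Thread(() -> {
            synchronized (lock) {
                data = 42;     // stores to shared variables happen...
                ready = true;  // ...before the lock release (the sync point)
            }
        });
        writer.start();
        boolean seen = false;
        while (!seen) {
            // Monitor entry is a synchronization point: 'ready' must be
            // reloaded from memory here, not kept live in a register.
            synchronized (lock) {
                seen = ready;
            }
        }
        writer.join();
        return data; // reloaded after synchronization: guaranteed 42
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(readAfterSync());
    }
}
```

Without the synchronized block in the loop, a compiler obeying only a lax model would be free to hoist the load of `ready` out of the loop, and the reader could spin forever on a stale register value.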
David Chase