From: "Rodney M. Bates" <rodney.bates@wichita.edu>
Newsgroups: comp.compilers
Date: 27 Jan 2003 23:31:48 -0500
Organization: EarthLink Inc. -- http://www.EarthLink.net
References: 03-01-065 03-01-070 03-01-126 03-01-164
Keywords: optimize
Posted-Date: 27 Jan 2003 23:31:48 EST
"John R. Strohm" wrote:
>
> The fundamental problem is that everyone is thinking in terms of
> "value semantics" or "reference semantics", rather than "in", "out",
> or "in out", to steal the Ada terminology. I suspect that this probably
> has something to do with lack of exposure to any language other than
> C.
>
> Ada's approach is that parameters are either input parameters, output
> parameters, or update parameters. An input parameter cannot be
> altered by the subprogram. An output parameter cannot be READ by the
> subprogram, only written. An update parameter can be both read and
> written.
>
> Observe that, when the language is defined this way, the programmer no
> longer CARES about "value semantics" vs. "reference semantics". The
> programmer CANNOT screw himself by e.g. writing into a data structure
> that is intended to be constant but which, for efficiency and storage
> reasons, must be passed by reference. Nor can he screw himself by
> writing randomly into the ether when he treats something as reference
> that was passed by value. Further, the compiler is pretty much free
> to choose call-by-value or call-by-reference on a case-by-case basis.
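To make the three modes concrete, here is a minimal sketch; the
procedure and all the names in it are invented for illustration, not
taken from the thread:

    procedure Demo is

       A : Integer := 10;
       R : Integer;
       N : Integer := 0;

       procedure Update (Source : in     Integer;    -- may be read, not written
                         Result :    out Integer;    -- written, not read (Ada 83 rule)
                         Count  : in out Integer) is -- both read and written
       begin
          --  Source := 0;   -- illegal: assignment to an "in" parameter
          Result := Source * 2;
          Count  := Count + 1;
       end Update;

    begin
       Update (A, R, N);
    end Demo;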
Actually, this is somewhat of a time-bomb in Ada. In some cases, the
language specifies the actual parameter transmission mechanism, but
for other combinations of types and transmission modes it leaves the
choice between reference and some variation of copy in/out to the
compiler writer. In the presence of non-local variable references,
aliasing, etc., this choice can affect program semantics.
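For example, here is a hedged sketch of the kind of dependence I mean
(the type and all names are invented for illustration). An ordinary
array of integers is neither a by-copy nor a by-reference type, so the
compiler picks the mechanism, and an aliased global makes the choice
visible:

    procedure Alias_Demo is

       type Buffer is array (1 .. 10) of Integer;

       Global : Buffer := (others => 0);

       procedure Touch (B : in out Buffer) is
       begin
          B (1) := 1;
          --  By reference: Global (1) is already 1 here.
          --  By copy in/out: Global (1) stays 0 until Touch returns.
          if Global (1) = 1 then
             null;   -- reached only under by-reference passing
          end if;
       end Touch;

    begin
       Touch (Global);   -- the program's meaning depends on the mechanism
    end Alias_Demo;

Whether the test inside Touch succeeds is decided by the compiler
writer, not by anything in the program text.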
If you write an Ada program that depends on which mechanism the
compiler chooses, the language defines your program as "erroneous",
which means you will not get any error message and whatever happens is
entirely your fault. I'm sure erroneousness is undecidable.
It is feasible to get all or nearly all of this kind of
neither-defined-nor-detected behavior out of programming languages and
still keep them practical and efficient. The trend today is quite the
opposite. As most readers in this group well know, Ada is far from
the worst in this regard. The world deserves better.
--
Rodney M. Bates