From: | "John R. Strohm" <strohm@airmail.net> |
Newsgroups: | comp.compilers |
Date: | 26 Jan 2003 16:33:35 -0500 |
Organization: | Compilers Central |
References: | 03-01-065 03-01-070 03-01-126 |
Keywords: | design |
Posted-Date: | 26 Jan 2003 16:33:35 EST |
> > Making arguments read-only is not sufficient to ensure that
> > passing variables by reference or by value will be completely
> > transparent to the user. ...
>
> I can't think of a better solution than to enforce a mindset that any
> parameter might be passed by reference and the programmer must not
> rely on any assumptions on the passing method. If reference semantics
> are desired, a pointer should be used. This seems more reasonable
> than always enforcing a pass-by-value policy.
>
> [Gee, we're reinventing Fortran. -John]
The fundamental problem is that everyone is thinking in terms of
"value semantics" or "reference semantics", rather than "in", "out",
or "in out", to steal the Ada terminology. I suspect that this
probably has something to do with a lack of exposure to any language
other than C.
Ada's approach is that parameters are either input parameters, output
parameters, or update parameters. An input parameter cannot be
altered by the subprogram. An output parameter cannot be READ by the
subprogram, only written. An update parameter can be both read and
written.
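For instance, here is a minimal Ada sketch of the three modes (the
procedure name and parameters are invented for illustration):

    procedure Step_Counter (Counter : in out Integer;  -- may be read and written
                            Step    : in     Integer;  -- read-only in the body
                            Old     :    out Integer)  -- write-only in the body
    is
    begin
       Old     := Counter;
       Counter := Counter + Step;
       --  Step := 0;  -- rejected at compile time: "in" parameters are constants
    end Step_Counter;

The body states what it needs from each parameter; it says nothing
about how the argument actually gets there.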
Observe that, when the language is defined this way, the programmer no
longer CARES about "value semantics" vs. "reference semantics". The
programmer CANNOT screw himself by e.g. writing into a data structure
that is intended to be constant but which, for efficiency and storage
reasons, must be passed by reference. Nor can he screw himself by
writing randomly into the ether when he treats something as a reference
that was passed by value. Further, the compiler is pretty much free
to choose call-by-value or call-by-reference on a case-by-case basis.
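To make the efficiency point concrete, here is a hedged sketch with a
large record (type and names invented here). The compiler may well pass
it by reference because copying it would be expensive, but the body
still cannot modify it:

    with Ada.Text_IO;

    procedure Demo is
       type Big_Record is record
          Data : String (1 .. 10_000) := (others => ' ');
       end record;

       --  Mode "in": the compiler is free to pass this by reference,
       --  yet the body cannot write into the caller's record.
       procedure Dump (Item : in Big_Record) is
       begin
          Ada.Text_IO.Put_Line (Item.Data (1 .. 10));
          --  Item.Data (1) := 'X';  -- would be rejected at compile time
       end Dump;

       R : Big_Record;
    begin
       Dump (R);
    end Demo;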