Related articles
[5 earlier articles]
Re: Language design question joachim.durchholz@halstenbach.com.or.de (Joachim Durchholz) (2000-02-16)
Re: Language design question joachim.durchholz@halstenbach.com.or.de (Joachim Durchholz) (2000-02-17)
Re: Language design question kst@cts.com (Keith Thompson) (2000-02-19)
Re: Language design question thp@roam-thp2.cs.ucr.edu (Tom Payne) (2000-02-19)
Re: Language design question Andrew.Walker@nottingham.ac.uk (Dr A. N. Walker) (2000-02-27)
Re: Language design question hannah@mamba.pond.sub.org (2000-03-21)
Re: Language design question frederic_guerin@yahoo.com (Frederic) (2000-03-25)
Re: Language design question world!bobduff@uunet.uu.net (Robert A Duff) (2000-03-25)
From: Frederic <frederic_guerin@yahoo.com>
Newsgroups: comp.compilers
Date: 25 Mar 2000 02:31:22 -0500
Organization: Compilers Central
References: 00-02-065
Keywords: design
I just want to point out something that doesn't seem so obvious
to me.
Mr Flisakowski proposed the following example:
> type TRec = struct
> {
> a: short;
> }
>
> var p: pointer to pointer to pointer to TRec;
> var r: pointer to TRec;
>
> p = r; // Ok, equiv to: **p = r
> r = p; // Ok, equiv to r = **p
Why should (p = r) be equivalent to (**p = r)
instead of (***p = *r)?
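To make the difference concrete, here is roughly what the two readings
would do once the dereferences are spelled out explicitly. I use plain
C only as an illustration here; it is not the language under
discussion, and the names are just taken from the quoted example:

    struct TRec { short a; };

    void two_readings(struct TRec ***p, struct TRec *r)
    {
        **p = r;    /* reading 1: make the innermost pointer refer
                       to the same TRec that r refers to          */
        ***p = *r;  /* reading 2: copy the TRec value behind r
                       into the TRec that p ultimately reaches
                       (maximum dereference on both sides)        */
    }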
I think that in an auto-dereferencing language, dereferencing should
go to the maximum level by default. You could then use a referencing
operator to control this *maximum* level. In this way, (**p = r)
would be written as (&p = &r).
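Spelled out in the same C notation as above, and assuming my reading
of this rule is right (each & on an operand backs that side off by
one level from the maximum), the mapping would be something like:

    struct TRec { short a; };

    void proposed_rule(struct TRec ***p, struct TRec *r)
    {
        ***p = *r;   /* surface form: p = r   -- both sides
                        dereferenced to the maximum by default   */
        **p  = r;    /* surface form: &p = &r -- one level less
                        on each side                             */
    }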
Auto-dereferencing sounds like a nice feature to me. It hides the
depth of the referencing chain from the programmer, and thus agrees
with the information-hiding principle.
Frederic Guerin