From: "Mark" <whopkins@alpha2.csd.uwm.edu>
Newsgroups: comp.compilers
Date: 24 Nov 2002 01:27:33 -0500
Organization: University of Wisconsin - Milwaukee, Computing Services Division
References: 02-11-083 02-11-097 02-11-110
Keywords: design, types
Posted-Date: 24 Nov 2002 01:27:33 EST
"Peter Flass" <peter_flass@yahoo.com> writes:
>I believe (IMHO) that nowadays the consensus is that having to declare
>everything is a "good thing" since it acts as a check against spelling
>errors.
I'm of the opinion that the language should have a type system that
is essentially an implementation of a suitable variant of Category
Theory, but that typing should be completely self-deduced and invisible
to the programmer.
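A small sketch in OCaml (used here only as a convenient illustration of
fully self-deduced typing, not something the argument depends on):
nothing below carries a type annotation, yet the compiler deduces and
checks a type for every definition.

    (* No type is written anywhere; the compiler infers all of them. *)
    let compose f g x = f (g x)
    (* inferred: val compose : ('a -> 'b) -> ('c -> 'a) -> 'c -> 'b *)

    let twice f = compose f f
    (* inferred: val twice : ('a -> 'a) -> 'a -> 'a *)

    let () = print_int (twice (fun n -> n + 3) 10)   (* prints 16 *)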
That way you can throw out a huge part of the language's syntax (along
with the extra drag it presents on the learning curve) and focus all
the language's resources on what it's meant for: specifying computation
and synchronization -- all without losing the discipline of an extremely
powerful type system.
It's also the only true way to fully achieve the ideal of complete and
unfettered polymorphism.
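The polymorphism comes for free (OCaml again, purely as an
illustration): one definition, with no declarations at all, is usable
at every type.

    (* One definition, no declarations, polymorphic over all types. *)
    let rotate (a, b, c) = (b, c, a)
    (* inferred: val rotate : 'a * 'b * 'c -> 'b * 'c * 'a *)

    let _ = rotate (1, "two", 3.0)      (* ("two", 3.0, 1) *)
    let _ = rotate (rotate (1, 2, 3))   (* (3, 1, 2) *)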
This latter feature, of course, brings up the true reason for typing.
It's not for the programmer's benefit at all, but for the compiler's.
Without it, either the language has to be designed so well that the
sequence
(A = B, B = C, C = A)
can always be compiled the same way, or the compiler has to know how
big the things that A, B and C translate to will be, or it has to
[re]translate them differently in different contexts, depending on
what the language's internal typing says.
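The first branch is achievable in practice. Here is a sketch assuming a
uniform, word-sized value representation (anything larger lives behind a
pointer) -- the route OCaml happens to take, mentioned only as an
illustration: the cyclic assignment then compiles one way for all types,
with no size information needed.

    (* The assignments (A = B, B = C, C = A), written once for any type.
       With a uniform word-sized representation, each one is a single
       pointer-sized move, so no sizes need to be known. *)
    let cycle a b c =
      a := !b;
      b := !c;
      c := !a
    (* inferred: val cycle : 'a ref -> 'a ref -> 'a ref -> unit *)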
For interpreted languages, however, there is no issue with sizing.
So, there's no real obstacle there.
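Roughly speaking (a sketch; the value/env names are hypothetical, just
for illustration): an interpreter keeps every variable as a reference to
a tagged value, so an assignment is one pointer copy regardless of how
big the value behind it is.

    (* Every variable maps to a reference to a tagged value. *)
    type value = Int of int | Str of string | Pair of value * value

    type env = (string, value ref) Hashtbl.t

    (* Implements the assignment "x = y": a single pointer copy,
       whatever kind of value y currently holds. *)
    let assign (env : env) x y =
      (Hashtbl.find env x) := !(Hashtbl.find env y)

The run-time tag tells the interpreter what each value is, which is
exactly the bookkeeping a static type system would let a compiler
discharge ahead of time.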