From: BGB <cr88192@hotmail.com>
Newsgroups: comp.compilers
Date: Fri, 09 Mar 2012 18:16:46 -0700
Organization: albasani.net
References: 12-03-012 12-03-014
Keywords: design, history
Posted-Date: 09 Mar 2012 23:29:33 EST
On 3/8/2012 8:21 AM, SLK Systems wrote:
>> Personally, I'd say there's been precious little new in programming
>> languages since Simula gave us OOP in the late 1960s.
> Yes, and milestones prior to that were
...
> C language - standardizing the syntax of procedural programming
> Wintel - standardizing the sub-programming language layer
...
> [Some of us who programmed in ANSI Standard Fortran 66 and PL/I 76
> might take issue with the claim that C standardized procedural
> programming. Standard high level procedural interfaces to operating
> systems aren't new either, Burroughs had them in Algol in the 1960s.
> -John]
I think the claim was not that C standardized procedural programming,
but rather that it standardized the syntax.
For example, C++, Java, and C# all use a core syntax very similar to
C's, and many other modern languages borrow elements of C's syntax as
well (JavaScript and ActionScript syntax is still fairly close, as are
many parts of PHP syntax, ...).
Granted, one can argue that C's syntax in turn followed from languages
like PL/I and ALGOL, but between them and C there is still a fair jump,
whereas the syntactic differences between C and many of the languages
which followed it were considerably less drastic. OTOH, it is rather
harder to find languages with syntax more obviously derived from
Fortran or PL/I than from C.
But I think the real issue is that both "innovation" and "pure
research" are often overrated, and what is needed at this point may not
be the creation of fundamentally new (or even entirely consistent)
languages, but rather refinement, integration, and adaptation to new
domains.
At this point, mainstream languages are still struggling to incorporate
features (such as closures) which have been known in other languages
for decades. For example, both C++ and Java have recently added
closures, and in both cases they come off as fairly poor attempts at
doing so.
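As a concrete illustration of the sort of awkwardness meant here (my example, not the original poster's): Java's lambdas can only capture locals that are effectively final, so something as simple as a closure over a mutable counter requires a workaround.

```java
import java.util.function.Supplier;

public class ClosureDemo {
    public static void main(String[] args) {
        // Supplier<Integer> bad = () -> counter++;  // would not compile:
        // captured locals must be effectively final in Java.
        int[] box = {0};              // the usual workaround: mutate through a one-element array
        Supplier<Integer> next = () -> ++box[0];
        next.get();
        next.get();
        System.out.println(box[0]);   // prints 2
    }
}
```

A closure in, say, Scheme or JavaScript would simply capture and mutate the variable directly; the boxing here is exactly the kind of bolted-on feel being criticized.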
So I think it is better to invest effort in creating "solid" languages
which can effectively integrate much of what already exists and
generally works well, even at the cost that many of the more
academically inclined are liable to level accusations of "blub" at such
things (mostly due to their syntactic and semantic similarity to
mainstream languages).
It is somewhat harder being a language designer who doesn't
particularly value novelty (people often call novelty "innovation", but
I personally see true innovation as finding a better way to approach a
problem, rather than being needlessly different).
I guess this is also mixed with the problem of many people confusing an
interface with its implementation: thinking that if something looks a
certain way it must also be implemented a certain way, or conversely
that if something looks different it must be conceptually different as
well.
I also tend to see needless minimalism as, well, needless. Simpler
syntax doesn't mean a simpler or easier-to-use language, and still less
does it mean a simpler implementation.
Some people also make accusations of "keeping every onion", but as I
see it, keeping common syntax and features by no means implies
slavishly following every possible rule.
The same goes, on the other side, for refusing to adopt some particular
library or implementation of a feature: this does not mean one is
automatically "creating a standard of non-standard". (Seriously, can't
one implement something as they see fit without everyone assuming it is
necessarily broken or incompatible with existing implementations?)
Likewise, can't a person be free to pick and choose: reusing any old
parts which seem useful, and creating new parts and technologies if and
when those seem more useful?
Just because a language looks sort of like the mainstream languages
doesn't mean it inherits all of their semantic limitations, nor that it
necessarily uses the same scope model, nor even that it treats
statements and expressions the same way.
For example, in my language:
I use a delegation-based scope model (more like that in Self), rather
than strict lexical scoping (though lexical scoping is still present);
the line between statements and expressions is much more lax
(semantically, nearly everything is an expression);
the object model differs in many ways (it is neither strictly
class/instance nor prototype OO);
the type system isn't strictly conventional either (it mixes static,
dynamic, and inferred types, and is more a hybrid of a Scheme-like and
a C-like type system than strictly one or the other);
...
But, practically, a person can still write code as they would in a more
conventional language, and it will still work mostly as expected (most
added features are non-intrusive: if you don't use a feature, you don't
need to deal with it).
For example, unless one writes a piece of code that makes use of it,
the differences between lexical and delegation-based scoping are
unlikely to be obvious, at least until one starts daisy-chaining
objects and environment frames together (this is actually how my VM
implements packages and importing, FWIW). In fact, it is even possible
to link one's scoping into a cyclic graph, and the VM allows this (it
detects and ignores cycles in the scope graph).
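A minimal sketch of what such cycle-tolerant, delegation-based lookup might look like (Java for illustration only; the class and method names here are made up, not taken from the VM being described):

```java
import java.util.*;

// Hypothetical sketch: name lookup over a delegation graph that may contain cycles.
class Scope {
    final Map<String, Object> vars = new HashMap<>();
    final List<Scope> delegates = new ArrayList<>();  // links to other scopes/objects

    Object lookup(String name) {
        return lookup(name, Collections.newSetFromMap(new IdentityHashMap<>()));
    }

    private Object lookup(String name, Set<Scope> seen) {
        if (!seen.add(this))              // already visited: ignore the cycle
            return null;
        if (vars.containsKey(name))       // found locally
            return vars.get(name);
        for (Scope d : delegates) {       // otherwise, try each delegate in order
            Object v = d.lookup(name, seen);
            if (v != null)
                return v;
        }
        return null;
    }
}
```

Two scopes that delegate to each other can still resolve names, and a lookup for a missing name still terminates, because the visited set makes the walk ignore cycles.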
So a unified model exists. On the surface it is exposed (in a "weaker"
form) via the syntactic sugar known as "packages" and "imports"
("namespaces" and "using" in C# or C++ terms), but, if a person wants,
they can use a few modifiers and work directly with the underlying
semantic model. Similar goes for classes: they are partly "real"
classes, and partly syntactic sugar over a prototype-OO model. There
are many other kinds of syntactic sugar as well (properties, operator
overloading, ...).
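The "classes as sugar over prototypes" idea can be sketched roughly as follows (again Java, purely illustrative; `Proto` and `newInstance` are invented names, not the actual VM's API):

```java
import java.util.*;

// Illustrative sketch: a "class" is just a prototype object holding shared slots,
// and "new" creates an empty object that delegates to it.
class Proto {
    final Map<String, Object> slots = new HashMap<>();
    Proto parent;                        // delegation link

    Object get(String name) {
        if (slots.containsKey(name))
            return slots.get(name);      // own slot shadows the prototype
        return (parent != null) ? parent.get(name) : null;
    }

    static Proto newInstance(Proto clazz) {
        Proto obj = new Proto();
        obj.parent = clazz;              // instance delegates to its "class"
        return obj;
    }
}
```

A slot defined on the "class" is visible from every instance via delegation, while a per-instance slot shadows it; class-style surface syntax can desugar to exactly this kind of structure.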
...
But many people apparently see a C-family syntax and automatically
judge a language negatively as a result, whereas I happen to feel that
the syntax works fairly well, and I personally see no "obviously
better" solution (either functionally or aesthetically). I saw little
reason to break things which, IMO, actually tend to work fairly well.
Ironically, though, I don't have "user-defined syntax", partly because
I personally felt it was a misfeature (I see more potential drawbacks
than merits in it).
Now, whether or not there is much value in being a language
designer/implementer is a secondary issue. Ultimately, a language needs
to serve my own uses effectively before it can have much hope of being
useful to anyone else; I have no personal need or expectation to
"change the world". And even if none of the features are terribly
original, where is the problem?