Date: Wed, 13 Jun 2012 01:16:15 +1000
References: 12-03-012 12-03-014 12-06-008 12-06-032 12-06-034
Posted-Date: 15 Jun 2012 14:15:52 EDT
From: "glen herrmannsfeldt" <email@example.com>
> Torben Ægidius Mogensen <firstname.lastname@example.org> wrote:
>>>>>Personally, I'd say there's been precious little new in programming
>>>>>languages since Simula gave us OOP in the late 1960s.
>> I wouldn't say so. Advanced type systems (bounded polymorphism and
>> linear types to name a few) have entered the picture since.
>> As John mentioned, APL has been around for ages and used a lot of
>> non-ASCII symbols. Algol was originally designed to use several
>> non-ASCII symbols that could be encoded in different ways depending on
>> the local symbol set. ASCII was by no means a standard then --
>> FIELDATA and EBCDIC were common alternatives, so the choice was either
>> to limit the language to use the common subset (which was rather
>> small) or to use an ideal set of symbols and allow these to be
> I thought ALGOL was older than both ASCII and EBCDIC.
Algol 58 preceded both -- 1958. See http://en.wikipedia.org/wiki/ALGOL
Next came ASCII in 1963. See http://en.wikipedia.org/wiki/ASCII
Finally came EBCDIC in 1964, but it probably didn't see actual use until
the S/360 in 1965(?). See http://en.wikipedia.org/wiki/EBCDIC Before
that was a 3-zone code for punched card equipment, based on the first
three card rows, denoted Y, X and 0.
ASCII was freer to use a consistent assignment of characters,
with all the letters for a given case in consecutive binary positions.
On the other hand, EBCDIC was constrained by card encodings.
With the introduction of a 4-zone punch card code (Y, X, 0, 8),
many extra characters could be included, mostly punctuation.
For purposes of illustration only, the Y (or leading) row of the card
could contribute, say, 32 to the value of a character;
the X or second row could contribute, say, 48;
while the 0 or third row could contribute, say, 64.
A punching in rows 1 to 9 could contribute the value of the digit.
Thus, alphabetic characters A to I would fall in the range 33 to 41;
J to R fall in the range 49 to 57, while S to Z fall in the range 66 to 73.
Thus, each of the rows Y, X and 0 could contribute a single bit
to the final value for the card column (but in practice,
it is more convenient to translate to two bits),
while rows 1 to 9 are converted to a 4-bit value.
This arrangement simplified the electronics for the card reader.
> EBCDIC, and its punched card coding, came with S/360 and the 029
> keypunch. Before that, IBM had BCDIC (a six bit code) and the 026.
> Was going from six bit codes to seven-bit ASCII a great awakening,
> or a big mistake, not going directly to eight bits?
Back in the 1960s and 70s, few devices, electronic or otherwise,
could support more than 64 printable characters, so ASCII was adequate
for many years to come. Those who wanted to could
use upper and lower case with such devices as the ASR 38, Memorex 1240, etc.,
and the Friden Flexowriter (a slightly modified version was available for Algol).
The ubiquitous ASR 33 (mechanical and electronic forms)
used an 8-bit code. The 8th bit could be left blank or could be
used for parity, so 7 active bits proved to be sufficient for the times.
More than 7 would have rendered parity checking impossible,
and computer paper tape readers typically checked parity.
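The 7-data-bits-plus-parity frame can be sketched as follows, assuming
even parity (tape equipment used either convention; the function names
here are mine, not from any particular device):

```python
# Sketch of an 8-bit paper-tape frame: low 7 bits carry the character
# code, the 8th bit is set so the total number of 1-bits is even.
def with_even_parity(ch):
    code = ord(ch) & 0x7F            # 7-bit character code
    parity = bin(code).count("1") & 1  # 1 if popcount of code is odd
    return code | (parity << 7)      # bit 7 makes the frame's popcount even

def parity_ok(frame):
    """Check a received frame: an even popcount means no single-bit error."""
    return bin(frame).count("1") % 2 == 0
```

An 8th data bit would have left no room for this check, which is the
trade-off mentioned above.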
>> ASCII certainly has the advantage of being easy to type using a
>> standard keyboard,
That's because the keyboard was designed for ASCII.
As for Algol 58 (and then Algol 60), it was designed as a
publication language, for which it filled the bill admirably.
Implementing it on a computer was, however, compromised
through the use of many characters not available on I/O
equipment. The lack of a means of back-spacing on most
preparation equipment meant that alternatives such as reserved words,
apostrophised keywords, and upper-casing everything were sought as
substitutes for underlining, and of course, I/O statements differed
from installation to installation.
But, as they say, it's the thought that counts!