Programming language specification languages email@example.com (2001-09-20)
Re: Programming language specification languages firstname.lastname@example.org (2001-09-25)
Re: Programming language specification languages email@example.com (Joachim Durchholz) (2001-10-06)
I'm russian! was character sets firstname.lastname@example.org (2001-10-13)
Re: I'm russian! was character sets email@example.com (2001-10-14)
Re: I'm russian! was character sets firstname.lastname@example.org (Thomas Maslen) (2001-10-20)
Unicode, was: I'm Russian! email@example.com (Ray Dillinger) (2001-11-25)
Re: Unicode, was: I'm Russian! firstname.lastname@example.org (Martin von Loewis) (2001-11-26)
Date: 13 Oct 2001 23:07:14 -0400
References: 01-09-087 01-09-106 01-10-021
Posted-Date: 13 Oct 2001 23:07:14 EDT
"Joachim Durchholz" <email@example.com> wrote...
> RKRayhawk <firstname.lastname@example.org> wrote:
> > It seems worth questioning whether professionals nowadays ought to
> > be oriented to 8-bit foundations.
> Well, this depends entirely on the programming language. For a
> programming language, I'm still strongly with 7-bit ASCII. This is
> because I want my software to be portable: across locales (which means
> I have to write in English, and I don't really need more than 7-bit
> ASCII), and across operating systems (which means it should be
> representable on the common EBCDIC code pages).
> Besides, there are some uniqueness issues. For example, the letter "a"
> is present both in Latin and Cyrillic alphabets, but the Cyrillic
> variant has a code point that's different from the Latin one. I'm not
> sure whether it's a real issue, and I'd like to hear any personal
> experience from, say, Russian Java programmers. Anyway, I'm suspicious
> about the issue; programming requires precision, and this uniqueness
> issue is another source of imprecision.
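The lookalike-letter point above is easy to demonstrate concretely. A minimal Python sketch (illustrative only, not from the original thread):

```python
# Latin "a" (U+0061) and Cyrillic "а" (U+0430) render identically in many
# fonts, yet they are distinct code points, so a compiler or interpreter
# treats them as different characters in identifiers and literals.
latin_a = "a"
cyrillic_a = "\u0430"
assert latin_a != cyrillic_a      # not the same character
assert ord(latin_a) == 0x61       # LATIN SMALL LETTER A
assert ord(cyrillic_a) == 0x430   # CYRILLIC SMALL LETTER A
```

This is exactly the imprecision worried about: two visually identical identifiers could compile to two different names.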
I'm Russian. What are you talking about? The notation of the programming
language and the naming rules for objects (variables, classes, etc.) can be
pure 7-bit ASCII, but to satisfy Russia!!! (...do not forget our nuclear
missiles) we must be able to write and manipulate strings in the WIN-1251
code page (8-bit) or in Unicode (16-bit). Coming to an agreement with China
is much more difficult; love us, we do not need much! :)
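The WIN-1251-versus-Unicode round trip described here can be sketched in Python (the codec names `cp1251` and `utf-16-le` are Python's, used purely for illustration):

```python
# Russian text held as 8-bit Windows-1251 bytes versus 16-bit UTF-16 units.
text = "Привет"                  # "Hello" in Russian, six letters
cp1251 = text.encode("cp1251")   # one byte per character
utf16 = text.encode("utf-16-le") # two bytes per character
assert len(cp1251) == 6
assert len(utf16) == 12
# Either representation decodes back to the same Unicode string.
assert cp1251.decode("cp1251") == utf16.decode("utf-16-le") == text
```

So a language whose identifiers are plain ASCII can still satisfy the requirement, as long as its string type round-trips both 8-bit code pages and 16-bit Unicode.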
> Of course, scripting languages intended for the hand of the end user
> *should* be able to support 16-bit characters (which, today, means ...
Yes. String, character, and character-array types and constants must
support a 16-bit representation. At least.