I'm Russian! was character sets

crystal-pin@mail.ru (alys)
13 Oct 2001 23:07:14 -0400

          From comp.compilers

Related articles
Programming language specification languages nmm1@cus.cam.ac.uk (2001-09-20)
Re: Programming language specification languages rkrayhawk@aol.com (2001-09-25)
Re: Programming language specification languages joachim_d@gmx.de (Joachim Durchholz) (2001-10-06)
I'm Russian! was character sets crystal-pin@mail.ru (2001-10-13)
Re: I'm Russian! was character sets spinoza1111@yahoo.com (2001-10-14)
Re: I'm Russian! was character sets tmaslen@wedgetail.com (Thomas Maslen) (2001-10-20)
Unicode, was: I'm Russian! bear@sonic.net (Ray Dillinger) (2001-11-25)
Re: Unicode, was: I'm Russian! loewis@informatik.hu-berlin.de (Martin von Loewis) (2001-11-26)

From: crystal-pin@mail.ru (alys)
Newsgroups: comp.compilers
Date: 13 Oct 2001 23:07:14 -0400
Organization: http://groups.google.com/
References: 01-09-087 01-09-106 01-10-021
Keywords: i18n
Posted-Date: 13 Oct 2001 23:07:14 EDT

"Joachim Durchholz" <joachim_d@gmx.de> wrote...
> RKRayhawk <rkrayhawk@aol.com> wrote:
> > It seems worth questioning whether professionals nowadays ought to
> > be oriented to 8-bit foundations.
>
> Well, this depends entirely on programming language. For a
> programming language, I'm still strongly with 7-bit ASCII. This is
> because I want my software to be portable: across locales (which means
> I have to write in English, and I don't really need more than 7-bit
> ASCII), and across operating systems (which means it should be
> representable on the common EBCDIC code pages).
>
> Besides, there are some uniqueness issues. For example, the letter "a"
> is present both in Latin and Cyrillic alphabets, but the Cyrillic
> variant has a code point that's different from the Latin one. I'm not
> sure whether it's a real issue, and I'd like to hear any personal
> experience from, say, Russian Java programmers. Anyway, I'm suspicious
> about the issue; programming requires precision, and this uniqueness
> issue is another source of imprecision.
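
[To make the uniqueness issue above concrete, here is a minimal Java
sketch of my own (Java only because Joachim asks about Russian Java
programmers; the class name is made up). The two letters render
identically on screen but occupy different code points, so they never
compare equal:]

    // Latin "a" is U+0061; Cyrillic "a" is U+0430. They look the same
    // but are distinct characters to the compiler and the runtime.
    public class Homoglyph {
        public static void main(String[] args) {
            char latin = '\u0061';      // Latin small letter a
            char cyrillic = '\u0430';   // Cyrillic small letter a
            System.out.println(latin == cyrillic);   // false
            System.out.println((int) latin);         // 97
            System.out.println((int) cyrillic);      // 1072
        }
    }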


I'm Russian. What are you talking about? The notation of the
programming language and the naming rules for objects (variables,
classes, etc.) can be pure 7-bit ASCII, but to satisfy Russia!!!
(...do not forget our nuclear missiles) we must be able to write and
manipulate strings in the Windows-1251 code page (8-bit) or in
Unicode (16-bit). Coming to an agreement with China is much more
difficult; love us, we do not need a lot! :)
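
[A minimal Java sketch of this point, mine rather than part of the
original exchange: the same Russian word as 8-bit Windows-1251 bytes
and as Java's 16-bit Unicode string. It assumes the JRE ships the
windows-1251 charset, which desktop JREs normally do:]

    import java.nio.charset.Charset;

    public class Cp1251Demo {
        public static void main(String[] args) {
            // 0xEC 0xE8 0xF0 encodes "mir" (Russian for "peace/world")
            // in the Windows-1251 code page, one byte per letter.
            byte[] cp1251 = { (byte) 0xEC, (byte) 0xE8, (byte) 0xF0 };
            String s = new String(cp1251, Charset.forName("windows-1251"));
            System.out.println(s);                  // the Cyrillic word
            System.out.println((int) s.charAt(0));  // 1084 = U+043C, now 16-bit
        }
    }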


> Of course, scripting languages intended for the hand of the end user
> *should* be able to support 16-bit characters (which, today, means
> Unicode).


Yes. String, character, and character-array types and constants must
support 16-bit representation. At least.
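
[One more illustrative Java sketch, my own addition, to justify the
"at least": Java's char is exactly one 16-bit UTF-16 code unit, which
is enough for Cyrillic, but characters outside the Basic Multilingual
Plane already need two of them (a surrogate pair):]

    public class SixteenBits {
        public static void main(String[] args) {
            String ya = "\u044F";          // Cyrillic "ya": one 16-bit char
            String clef = "\uD834\uDD1E";  // U+1D11E, musical G clef:
                                           // a surrogate pair, two chars
            System.out.println(ya.length());               // 1
            System.out.println(clef.length());             // 2 code units
            System.out.println(clef.codePointCount(0, 2)); // 1 code point
        }
    }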


Regards, alys.

