From: "Joachim Durchholz" <joachim_d@gmx.de>
Newsgroups: comp.compilers
Date: 6 Oct 2001 16:36:24 -0400
Organization: Compilers Central
References: 01-09-087 01-09-106
Keywords: design, i18n
Posted-Date: 06 Oct 2001 16:36:24 EDT
RKRayhawk <rkrayhawk@aol.com> wrote:
> It seems worth questioning whether professionals nowadays ought to
> be oriented to 8-bit foundations.
Well, that depends entirely on the programming language. For the source
text of a programming language, I'm still strongly in favor of 7-bit
ASCII. This is because I want my software to be portable: across locales
(which means I have to write in English, and for that I don't really
need more than 7-bit ASCII), and across operating systems (which means
the source should be representable on the common EBCDIC code pages).
Besides, there are some uniqueness issues. For example, the letter "a"
exists in both the Latin and Cyrillic alphabets, but the Cyrillic
variant has a code point different from the Latin one. I'm not sure
whether this is a real problem in practice, and I'd like to hear any
personal experience from, say, Russian Java programmers. In any case,
I'm suspicious of it; programming requires precision, and two visually
identical characters with distinct code points are another source of
imprecision.
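To make the point concrete, here is a minimal sketch (the class name is
made up for illustration, and it assumes the source file is saved and
compiled as UTF-8, e.g. javac -encoding UTF-8): Latin "a" is U+0061 and
Cyrillic "а" is U+0430, so two identifiers that look identical on the
screen are nevertheless distinct to the compiler.

    // Latin "a" (U+0061) and Cyrillic "а" (U+0430) look the same, but
    // Java treats them as two different identifiers.
    public class Homoglyph {
        public static void main(String[] args) {
            int a = 1;   // Latin small letter a, U+0061
            int а = 2;   // Cyrillic small letter a, U+0430
            System.out.println(a);          // prints 1
            System.out.println(а);          // prints 2
            System.out.println((int) 'a');  // 97   (0x61)
            System.out.println((int) 'а');  // 1072 (0x430)
        }
    }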
Of course, scripting languages intended to be put in the hands of end
users *should* support 16-bit characters (which, today, means Unicode).
Regards,
Joachim