Related articles:
  Ignoring Type Differences, klapoin1@twcny.rr.com (Ken and Kristi LaPoint) (2001-07-17)
  Re: Ignoring Type Differences, joachim_d@gmx.de (Joachim Durchholz) (2001-07-18)
  Re: Ignoring Type Differences, vbdis@aol.com (2001-07-18)
From: vbdis@aol.com (VBDis)
Newsgroups: comp.compilers
Date: 18 Jul 2001 20:08:52 -0400
Organization: AOL Bertelsmann Online GmbH & Co. KG http://www.germany.aol.com
References: 01-07-074
Keywords: types
Posted-Date: 18 Jul 2001 20:08:52 EDT
"Ken and Kristi LaPoint" <klapoin1@twcny.rr.com> schreibt:
>Can anyone tell me when you might want the compiler to ignore type
>differences in an expression?
IMO a compiler cannot "ignore" type differences; it can only silently
"convert" the operands to a common data type before further
processing. Such conversions require rules for how to make different
types compatible.
The usual implicit conversions handle combinations of signed and
unsigned types, types of different precision, and types of different
"nature" (integral, floating point, ...).
Depending on the language, some types are designed to be incompatible
with other types, such as boolean, char, or enum. I personally
appreciate such strict typing, and want the compiler to convert only
types of computational "usage", or to up/down-cast class types as
appropriate. This principle can be reduced to type casts within class
trees, when numeric types are also treated as classes.
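
A hedged C++ sketch of this stricter discipline (the type names are
illustrative only): scoped enums refuse implicit conversion to int,
up-casts within a class tree are implicit because they are always
safe, and down-casts must be explicit and checked:

    #include <iostream>

    enum class Color { Red, Green };

    struct Base    { virtual ~Base() = default; };
    struct Derived : Base { int extra = 42; };

    int main() {
        // int bad = Color::Red;               // rejected: no implicit enum->int
        int ok = static_cast<int>(Color::Red); // the conversion must be spelled out

        Derived d;
        Base* b = &d;                            // up-cast: implicit, always safe
        Derived* p = dynamic_cast<Derived*>(b);  // down-cast: explicit, checked at run time
        if (p) std::cout << ok << ' ' << p->extra << '\n';
        return 0;
    }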
DoDi