From: Bart <bc@freeuk.com>
Newsgroups: comp.compilers
Date: Fri, 3 May 2019 21:10:34 +0100
Organization: virginmedia.com
References: 19-05-014 19-04-021 19-04-023 19-04-037 19-04-039 19-04-042 19-04-044 19-04-047 19-05-004 19-05-008 19-05-014 19-05-021
Injection-Info: gal.iecc.com; posting-host="news.iecc.com:2001:470:1f07:1126:0:676f:7373:6970"; logging-data="22662"; mail-complaints-to="abuse@iecc.com"
Keywords: arithmetic, optimize, errors
Posted-Date: 03 May 2019 21:15:58 EDT
Content-Language: en-GB
On 03/05/2019 16:23, David Brown wrote:
> On 02/05/2019 21:04, Bart wrote:
>> On 02/05/2019 15:51, David Brown wrote:
>> The subject is UB, and whether the possibility of overflow can just be
>> ignored by a language, a language that deals with low-level
>> machine-sized types.
>>
>
> The answer is yes, it can.
>
> In assembly programming - as low as you get - overflow is almost always
> ignored.
Fine. In that case I want it ignored for other kinds of languages too,
and that includes a low-level language at the level of C.
>> You seem OK with a C compiler assuming that overflow cannot happen so
>> that it can generate slightly faster benchmarks.
>>
>
> I am not the slightest bit interested in performance in benchmarks.
I mean the people behind the compilers are.
>> I prefer a language acknowledging that it could happen, and stipulating
>> exactly what does happen.
>>
>
> There are lots of situations where behaviour is undefined, in /all/
> programming languages. The lower level and more efficient the language,
> the more such situations you get - but none are entirely free of
> undefined behaviour. And the more you fight undefined behaviour, the
> more limited your coding will be. Accept it, realise that it is part of
> the world of programming, and you will get on much better.
I think you've just used too much C.
There's nothing wrong with a language saying it expects to target two's
complement machines with power-of-two word sizes, and that it expects
overflow on such types to be well-defined on those machines.
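To illustrate, here is a small sketch of what well-defined wrapping
looks like. (This assumes gcc or clang with the -fwrapv switch, which
is one concrete way of getting that behaviour on such machines.)

    /* build with: gcc -fwrapv wrap.c */
    #include <stdio.h>
    #include <limits.h>

    int main(void) {
        int x = INT_MAX;
        x = x + 1;              /* with -fwrapv this wraps to INT_MIN */
        printf("%d\n", x);      /* prints -2147483648 for a 32-bit int */
        return 0;
    }

Without -fwrapv the same addition is UB in C; with it, the result is
the obvious two's complement one.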
There's nothing to fight. Certainly I'm not interested in fighting C.
(Actually, I've dropped C as a possible target from my compilers; it
was simply too much of a struggle to keep C compilers happy, with UB on
innocuous operations being one small part of it. And it was holding me
back from using advanced features that are difficult to express in C.)
> Then you might as well give up programming, because that can't be done.
Don't forget I normally devise my own languages and write my own
compilers, so I can largely do what I like.
> Only a fool uses unknown data from outside without checking them. Check
> that the data makes sense, then use it. Don't use it first then check
> for carnage afterwards.
I understand that your work is mostly concerned with small embedded
systems. Most of mine for about 15 years involved applications that had
to deal with user input.
> You are, as usual, very keen to pick out gcc as though it was something
> special here.
True. It's gcc /and/ clang (which tries to copy what gcc does) where
I've observed this behaviour, which is at odds not only with other
compilers and languages, but also with themselves.
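As a small illustration of the sort of thing I mean (a sketch; the
exact outcome depends on the compiler version and optimisation level):

    /* A naive "did it overflow?" check written after the addition. */
    int will_not_overflow(int x) {
        return x + 1 > x;   /* gcc/clang at -O2 may fold this to 1,
                               since signed overflow "cannot happen" */
    }

At -O0 you typically get the wrapping comparison you'd expect from the
hardware; turn the optimiser on and the check can disappear entirely.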