From: Martin Ward <martin@gkc.org.uk>
Newsgroups: comp.compilers
Date: Fri, 3 May 2019 10:52:27 +0100
Organization: Compilers Central
References: 19-04-021 19-04-023 19-04-037 19-04-039 19-04-042 19-04-044 19-04-047 19-05-004 19-05-006 19-05-016
Keywords: design, errors
Posted-Date: 03 May 2019 13:35:36 EDT
On 03/05/19 00:48, Bart wrote:
> And I think that if a program can
> go seriously wrong through unchecked input, then that's a failure in
> proper validation. It's rather sloppy to rely on a runtime check put
> there by a compiler.
The car analogy for C is that C is a car with no seatbelts, crumple
zones, roll bars, airbags etc. The car manual explicitly states that
nudging the kerb with any tyre is "undefined behaviour" and could
cause the car to explode in a fireball, killing all the passengers.
On 2019-05-01, David Brown <david.brown@hesbynett.no> wrote:
> Detecting signed overflow at run-time can be a significant cost.
Firstly: the cost is not as high as the cost of security breaches due
to buffer overflows. Secondly: if many popular languages specified
suitable handling for signed overflow, buffer overruns and so on, then
CPU hardware would be developed to make these tests efficient, because
compiled code in these popular languages would run faster on such
CPUs.
> I was talking about a /dimension/ of 2 billion - that is, a width or
> height of 2 billion.
If you are reading from an unknown file (e.g. an image on a web page)
then it would be foolish to assume that no dimension is bigger than 2
billion: security breaches due to carefully constructed image files
have occurred in the past. Also, the netpbm library can be used for
files containing data which is *not* image data: for example, as
generic utilities for processing huge bit strings. These bit strings
might well contain more than 2 billion bits (250 MB of data).
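The safe pattern when dimensions come from an untrusted file is to
validate them before the multiplication that sizes the allocation,
since width*height can itself wrap. A minimal sketch (the function
name and limits are mine for illustration, not netpbm's API):

```c
#include <stdint.h>
#include <stdlib.h>

/* Allocate a pixel buffer for dimensions read from an untrusted
   file.  Reject the input if width*height would overflow size_t,
   rather than letting the multiplication wrap and under-allocating
   (a classic source of image-file exploits). */
static unsigned char *alloc_pixels(uint64_t width, uint64_t height)
{
    if (width == 0 || height == 0)
        return NULL;
    if (width > SIZE_MAX / height)   /* multiplication would wrap */
        return NULL;
    return malloc((size_t)(width * height));
}
```

The division-based test avoids performing the overflowing multiply at
all, so it is well-defined even without compiler builtins.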
Back in the early days of Unix there were many utilities for
processing text files. It was discovered that many of these would
crash or hang when fed random binary data:
https://www.fuzzingbook.org/html/Fuzzer.html
ftp://ftp.cs.wisc.edu/paradyn/technical_papers/fuzz-revisited.ps
This is a problem because (1) a text utility can be used as a
general-purpose data manipulation program which is fed binary data,
and (2) more importantly, each crash is a potential security hole.
--
Martin
Dr Martin Ward | Email: martin@gkc.org.uk | http://www.gkc.org.uk
G.K.Chesterton site: http://www.gkc.org.uk/gkc | Erdos number: 4