oss-sec mailing list archives

Re: gcc 4.2 optimizations and integer overflow checks


From: Florian Weimer <fw () deneb enyo de>
Date: Fri, 11 Apr 2008 21:38:06 +0200

* Steven M. Christey:

> gcc 4.2.0 through 4.3.0 in GNU Compiler Collection, when casts are not
> used, considers the sum of a pointer and an int to be greater than or
> equal to the pointer, which might remove length testing code that was
> intended as a protection mechanism against integer overflow and buffer
> overflow attacks.

Some remarks are in order, I think.

The version range is a bit misleading.  The bug Nico unearthed affects
additional versions, and the issue is the same from a purely
phenomenological point of view.  The issue is not GCC-specific, either.

I'm also a bit at odds with the description of this issue.

C defines pointer arithmetic to be valid only for pointer values that
point within an allocated array, or one element past the last element of
that array.  Invalid pointers result in undefined behavior immediately,
not just when they are dereferenced.  Since the behavior is undefined, a
C implementation can essentially infer that no invalid pointer is ever
formed (even if the generated code is "wrong" in that case, the behavior
is undefined anyway, so it does not matter what the code does), and use
this knowledge in its optimizers.
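
To make this concrete, here is a minimal sketch (with made-up function
and variable names, not code from any particular report) of the kind of
check that falls victim to this inference:

  #include <stddef.h>

  int process(char *buf, size_t buflen, size_t len)
  {
      /* Intended as an overflow check: if len is huge, buf + len wraps
         around on most machines.  But merely forming a pointer past the
         end of the buffer is undefined, so the compiler may assume the
         condition is always false and delete the branch. */
      if (buf + len < buf)
          return -1;
      if (len > buflen)
          return -1;
      /* ... buf[0] through buf[len - 1] are assumed safe here ... */
      return 0;
  }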

The shortest summary I can come up with is this:

| The C standard permits certain optimizations that break code written
| in a way that assumes pointers behave like machine addresses,
| rendering certain forms of buffer overflow checks ineffective.

I also have a hard time believing that this affects real-world code
which has a reasonable claim to being correct with non-optimizing
compilers (or some hypothetical conservative C variant from the K&R
days).  Such code might just as well break when moved to a different
architecture, with a different pointer layout and different comparison
instructions for pointers.

There's another issue involving the undefinedness of signed integer
overflow, but this is nothing new; it has been documented publicly for
years and has been the subject of fierce discussion on the GCC mailing
list before.
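
The same inference applies there; a minimal sketch (again with invented
names):

  /* Signed overflow is undefined in C, so "a + 100 < a" can never be
     true in a conforming program.  The compiler may therefore fold the
     condition to false and remove the check entirely. */
  int add_checked(int a)
  {
      if (a + 100 < a)
          return -1;
      return a + 100;
  }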

A somewhat related issue is the operator new[] problem (the
multiplication which computes the array size can overflow and is
silently truncated, leading to an allocation which is too small;
comparable to the calloc bugs), which is something I think should be
fixed.  However, it seems that the C++ standard requires implementations
to have that issue, at least for custom allocators.  Yuck.
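
For reference, the calloc-style C analogue of that implicit
multiplication looks roughly like this (made-up names; operator new[]
performs the equivalent computation behind the scenes):

  #include <stdlib.h>

  struct item { char data[64]; };

  struct item *make_items(size_t n)
  {
      /* If n * sizeof(struct item) wraps around, malloc receives a
         small size and returns an allocation too small for n elements.
         The historical calloc bugs were this same multiplication
         performed unchecked inside the allocator. */
      return malloc(n * sizeof(struct item));
  }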

