Bug#345435: gcc-3.3: wrong evaluation of ++ expressions
On Saturday 31 December 2005 14:47, Martin Stumpf wrote:
> Package: gcc-3.3
> Version: 1:3.3.5-13
> Severity: normal
>
> The following program evaluates d to 13. On other architectures (and in
> java) the same code evaluates to 12.
> (2.95, 3.3, 3.4, 4.0, 4.1 do have the same bug)
>
> #include <iostream>
>
> using namespace std;
>
> int main(int argc, char* argv[]) {
> int b = 2;
> int d = ++b + ++b + ++b;
> std::cout << "b: " << b << "d: " << d << endl;
> return 0;
> }
>
>
> -- System Information:
> Debian Release: 3.1
> Architecture: i386 (i686)
> Kernel: Linux 2.6.8-2-k7
> Locale: LANG=C, LC_CTYPE=de_DE (charmap=ISO-8859-1)
>
> Versions of packages gcc-3.3 depends on:
> ii  binutils      2.15-6        The GNU assembler, linker and bina
> ii  cpp-3.3       1:3.3.5-13    The GNU C preprocessor
> ii  gcc-3.3-base  1:3.3.5-13    The GNU Compiler Collection (base
> ii  libc6         2.3.2.ds1-22  GNU C Library: Shared libraries an
> ii  libgcc1       1:3.4.3-13    GCC support library
>
> -- no debconf information
This is not a compiler bug. The expression ++b + ++b + ++b modifies b
three times without an intervening sequence point, which is undefined
behaviour in C and C++ (C++03), so the compiler is free to produce any
result, and different compilers or architectures may legitimately differ.
With -Wall, gcc warns that the operation on b "may be undefined",
whereas g++ emits no warning at all.
It is somewhat fascinating that this compiler happens to produce 12 for
int d = (b += ++b, b += b);
or for
int d = (++b + b++, ++b + b++, b + b);
or for
b += ++b; int d = b += b;
(though these expressions are equally undefined in C++03, since each
still modifies b twice between sequence points), but not for
int d = ++b + ++b + ++b;
Thanks for pointing this out.