
Re: MIT discovered issue with gcc



On 27/11/13 13:10, Wade Richards wrote:
Also, the deeper you get into the optimized code, the harder it is to
issue meaningful source-level warnings.  E.g. when the compiler optimizes:
static int decimate(int x) { return x/10; }
int foo() {
   int a=INT_MAX;
   int b;
   for(int i=0; i<100; ++i) { b=max(i, decimate(a*10));}
   return b;
}

into

int foo() { return INT_MAX; }

What warnings should appear for which lines?

Hi, thanks for the reply. I really hope I'm not missing your point here, but here goes:

Speaking as a programmer, the following would be nice:

"Warning (or error): a*10 can cause signed integer overflow on line 5 which is undefined behavior. Not optimizing anything beyond this point for the rest of the function."

If I'm sure this is what I intend, and I don't want to pay the non-optimization penalty, I would (and should anyway) rewrite it like this:

int foo() {
   int a=INT_MAX;
   int b;
   int i;
   for(i=0; i<100; ++i) { b=max(i, a);}
   return b;
}

... which, assuming a good max() function, falls within the confines of defined behavior, and thus the compiler should feel free to optimize away whatever it wants without making crazy assumptions. The concept is: because the behavior is defined, the compiler knows what the code flow will be.

This would make both my code and the compiler predictable.
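For reference, the max() I'm assuming throughout is nothing special; a minimal sketch (the original snippets leave it undefined, so this is just what I have in mind) would be:

static int max(int x, int y) { return x > y ? x : y; }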

In the following code, the compiler should feel free to optimize fully without emitting any warnings or errors:

int foo() {
   int a=INT_MAX / 11;
   int b;
   int i;
   for(i=0; i<100; ++i) { b=max(i, decimate(a*10));}
   return b;
}

In the following case I should get a warning, because 'a' is not confined to a safe range:

int foo(int a) {
   int b;
   int i;
   for(i=0; i<100; ++i) { b=max(i, decimate(a*10));}
   return b;
}

The following code should be fully optimized, with the loop reduced to the equivalent of "return max(99, a)", because the bounds check keeps a*10 within range:

int foo(int a) {
   if (a > 214748364 || a < -214748364) /* i.e. |a| <= INT_MAX/10 */
      return -1; /* or whatever */
   int b;
   int i;
   for(i=0; i<100; ++i) { b=max(i, decimate(a*10));}
   return b;
}
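If the range of 'a' can't be checked up front, another way to stay within defined behavior (my own sketch, not something gcc provides) is to do the multiplication in a wider type, assuming long long is wider than int, as on the usual 32-bit-int targets:

static int decimate10(int a) {
   /* widen before multiplying: long long is at least 64 bits wide
      (C99 5.2.4.2.1), while a*10 needs at most ~35 bits when int
      is 32 bits, so the intermediate value cannot overflow */
   long long t = (long long)a * 10;
   return (int)(t / 10); /* always equals a, with no UB on the way */
}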

Not sure if I'm making my point: I don't think any programmer would ever intentionally want his program to run into UB. Consequently, the compiler should never assume, suppose, or guess anything at all. It should always "infer", and never "suppose". Exploiting UB means supposing, as opposed to inferring.
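To make "inferring" concrete: if I spell the check out myself, every path the compiler sees has defined behavior, and it can infer the result instead of supposing anything. A minimal sketch (mul10_checked is just an illustrative name of mine):

#include <limits.h>

/* multiply by 10, with the range check made explicit */
static int mul10_checked(int a, int *out) {
   if (a > INT_MAX / 10 || a < INT_MIN / 10)
      return 0;      /* would overflow: report failure, no UB */
   *out = a * 10;    /* guaranteed to stay within [INT_MIN, INT_MAX] */
   return 1;
}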

Best regards.

