-1 < 6 ?? comparing negatives with positives = weird

Started by
9 comments, last by Inmate2993 18 years, 11 months ago
I had this in my code:

unsigned size = 6;
for ( int i = -1; i < size; ++i )
    //some stuff

The for loop didn't run - the comparison behaved as if i were >= size. As soon as i's initial value was changed to 0, however, it worked fine. How come?
my site | Genius is 1% inspiration and 99% perspiration
i is converted to unsigned. I guess it becomes something like 2^32 - 1, the largest unsigned value. If size were signed, it would work.
I'm not sure. I would have thought the compiler would avoid a promotion that changes the value, and instead "demote" size to signed. Unless the anonymous poster is right, I can't see why it wouldn't work. Either way, one solution is to use all signed variables, though I'm sure the original was written out of curiosity.


If you're using MSVC turn up the warning level to at least 3.

It will spew a warning if you compare signed to unsigned types (and a whole lot more). Try to fix all the warnings.


OK, I just converted everything to signed. That works. Thanks - I really had no idea what was going on.
I still don't fully understand what was going on, so for that reason I avoid unsigned variables anywhere they'll be used in comparisons. The only ones I use are tied to APIs that require them - OpenGL, the Win32 API, that sort of thing. Otherwise my class members and game variables are all signed.


I tried with MSVC.NET and assigning -1 to unsigned int without type-casting results in 2^32. So I believe that was going on.
It's actually well-defined for unsigned types - they wrap modulo 2^N (it's signed overflow that is undefined behaviour). The same conversion surprise comes up if you try to do...
for (unsigned int i = 10; i < 11; --i)
in MSVC v6.
The loop still terminates, but only because i wraps around to a huge value once it decrements past zero.
"In order to understand recursion, you must first understand recursion."
My website dedicated to sorting algorithms
Quote:Original post by Anonymous Poster
I tried with MSVC.NET and assigning -1 to unsigned int without type-casting results in 2^32. So I believe that was going on.

I'm sure you meant 2^32-1 since there's no way to store 2^32 in a 32 bit variable ;)
I'm 99.99% sure that your compiler is converting i to an unsigned int. Just change the other variable to a plain int and you should be fine. By the way, you almost never need unsigned ints - signed ints generally hold enough.
Quote:
I'm sure you meant 2^32-1 since there's no way to store 2^32 in a 32 bit variable ;)

Maybe he has a screwy 64-bit CPU that wraps ints around to 2^32 for no apparent reason :P

This topic is closed to new replies.
