In C, I am running into a weird problem with the following code:
#include <stdio.h>
#include <math.h>

int main(int argc, char *argv[]) {
    unsigned int i = 0;
    for (int j = i - 1; j < i + 1; j++) {
        printf("RUN1?\n");
    }

    int j = i - 1;
    int k = i + 1;
    for (; j < k; j++) {
        printf("RUN2?\n");
    }

    printf("DONE?");
}
The first loop does not run at all (0 iterations), while the second loop runs 2 times. I don't see what is fundamentally different between them.
The following also works:
for (; j < (int)(i + 1); j++) {}
I have seen some posts here, but did not find any good explanation.
Maybe someone can link one here?
The difference between the two loops is in the types used in the condition:

- j < i + 1: since i is unsigned (and so is i + 1), j is also converted to unsigned to perform the comparison. As an unsigned value it is a very large number (typically UINT_MAX) and is therefore never < i + 1; see the sketch below.
- j < k: these are both signed int, so neither operand is converted to unsigned. j is initialized to -1 when i - 1 is converted to signed int, and the result is the one you expected (2 iterations of the loop).

Your compiler should warn you about this issue.
E.g. MSVC issues the following warning for the condition in the first loop:
warning C4018: '<': signed/unsigned mismatch
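Here is a minimal sketch that makes the conversion in the first condition visible (the variable names mirror the question; the printf calls are only for illustration):

#include <stdio.h>

int main(void) {
    unsigned int i = 0;
    int j = i - 1;  /* typically -1; see the note below about this conversion */

    /* In j < i + 1 the usual arithmetic conversions turn j into an
       unsigned int (UINT_MAX here), which is never less than i + 1. */
    printf("j as unsigned : %u\n", (unsigned int)j);
    printf("j < i + 1     : %d\n", j < i + 1);          /* prints 0 */
    printf("j < (int)(i+1): %d\n", j < (int)(i + 1));   /* prints 1 */
    return 0;
}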
Note:
i - 1 is computed in unsigned arithmetic, so its value is UINT_MAX. That value does not fit in an int, and converting it to a signed type is implementation-defined (C11 6.3.1.3). A common result, which you also get here, is -1, but it is not guaranteed.
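If you want the first loop to behave like the second without relying on that implementation-defined conversion, one option (just a sketch, assuming i really starts at 0 and does not need to be unsigned) is to keep the whole computation in signed arithmetic:

#include <stdio.h>

int main(void) {
    int i = 0;  /* signed throughout, so no signed/unsigned conversions occur */
    for (int j = i - 1; j < i + 1; j++) {
        printf("RUN1?\n");  /* now runs twice, as expected */
    }
    return 0;
}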