I'm pretty much assuming this is a stupid question... but I can't really find the answer to it, so I'm asking it here.
For the purpose of learning about implicit type casting, I'm running the following code in C.
#include <stdio.h>

int main()
{
    unsigned char i;
    char cnt = -1;
    int a[255];

    for (int k = 0; k < 255; k++)
    {
        a[k] = k;
    }
    for (i = cnt - 2; i < cnt; i--)
    {
        a[i] += a[i + 1];
        printf("%d\n", a[i]);
    }
    return 0;
}
When I ran this program, nothing was printed.
I was able to find out that the condition of the second for-loop was false on the first iteration, so the program skipped the loop right away.
However, I don't understand why.
As far as I know, C does implicit casting when assigning or comparing variables of different types. So I thought that in i = cnt - 2, the subtraction produces the value -3, and then the implicit cast assigns 253 to i.
Then, shouldn't the condition i < cnt be true, since (by another implicit cast of cnt, because a signed and an unsigned char are compared) 253 is smaller than 255?
If it isn't, why is it false? Is there something I missed, or is there some exception I don't know about?
Your question is not stupid at all. You were close to the solution: i is assigned the value -3, but the implicit conversion to the type of i, unsigned char, changes the value to 253.
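To see just that conversion in isolation, here is a minimal sketch, assuming an 8-bit unsigned char (UCHAR_MAX == 255):

#include <stdio.h>

int main(void)
{
    unsigned char i = -3;  /* -3 cannot be represented, so it is converted modulo 256 */
    printf("%d\n", i);     /* prints 253 when unsigned char has 8 bits */
    return 0;
}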
For a more precise explanation, there are multiple issues in your test code:
char may be signed or unsigned depending on the platform and compiler configuration, so char cnt = -1; may store the value -1 or 255 into cnt, or even some other value if char is unsigned and has more than 8 bits.
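If you want to check what your own platform and flags do, a small probe like this, using CHAR_MIN, CHAR_MAX and CHAR_BIT from <limits.h>, will tell you:

#include <limits.h>
#include <stdio.h>

int main(void)
{
    /* CHAR_MIN is 0 if char is unsigned, negative if char is signed */
    printf("CHAR_BIT = %d, CHAR_MIN = %d, CHAR_MAX = %d\n", CHAR_BIT, CHAR_MIN, CHAR_MAX);
    printf("char is %s\n", CHAR_MIN < 0 ? "signed" : "unsigned");
    return 0;
}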
The behavior of for (i = cnt - 2; i < cnt; i--) also depends on whether char is signed or unsigned by default. In all cases, the test i < cnt is evaluated with both operands converted to int (or to unsigned int in the rare case where sizeof(int) == 1). If int can represent all values of types char and unsigned char, this conversion does not change the values.
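As a sketch of what those promotions mean for the test, assuming a typical platform where char is signed, has 8 bits, and int is wider than char:

#include <stdio.h>

int main(void)
{
    unsigned char i = 253;
    char cnt = -1;                      /* assuming char is signed here */

    /* Both operands are promoted to int before the comparison,
       so i < cnt compares 253 < -1, not 253 < 255. */
    printf("%d\n", i < cnt);            /* prints 0 */
    printf("%d\n", (int)i < (int)cnt);  /* equivalent comparison, also 0 */
    return 0;
}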
If char is unsigned and has 8 bits, cnt has the value 255, so i is initialized with the value 253 and the loop runs 254 times with i going from 253 down to 0; then i-- stores the value 255 into i again, for which the test i < cnt evaluates to false. The loop prints 507, then 759, ... up to 32385.
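In that case each a[i] ends up holding the suffix sum i + (i+1) + ... + 254, so the last value printed, a[0], is the sum of 0 through 254. A quick sanity check of that figure:

#include <stdio.h>

int main(void)
{
    int sum = 0;
    for (int k = 0; k < 255; k++)  /* sum of 0 + 1 + ... + 254 */
        sum += k;
    printf("%d\n", sum);           /* prints 32385, i.e. 254 * 255 / 2 */
    return 0;
}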
If char is signed and has 8 bits, as is probably the case on your system, cnt has the value -1 and i is initialized with the value -3 converted to unsigned char, which is 253. The initial test i < cnt evaluates as 253 < -1, which is false, causing the loop body to be skipped immediately.
You can force char to be unsigned by default by giving the compiler the appropriate flag (e.g. gcc -funsigned-char) and test how the behavior changes. Using Godbolt's Compiler Explorer, you can see that gcc generates just 2 instructions to return 0 in the signed (default) case, and produces the expected output in the unsigned case.
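For example, assuming the program is saved as test.c (the file names here are just placeholders):

gcc test.c -o test_signed && ./test_signed
# prints nothing on a platform where char is signed by default

gcc -funsigned-char test.c -o test_unsigned && ./test_unsigned
# prints 507, 759, ... 32385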