I was searching for code to determine the endianness of the system, and this is what I found:

#include <stdio.h>

int main(void)
{
    unsigned int i = 1;
    char *c = (char *)&i;
    if (*c) {
        printf("Little Endian\n");
    } else {
        printf("Big Endian\n");
    }
    return 0;
}
How does this code work? More specifically, why is the ampersand needed in the following cast?

char *c = (char *)&i;

What is stored in the pointer c? Is it the value of i, or the address of i? Also, why is a char used in this program?
When you dereference a character pointer, only one byte is interpreted (assuming a char takes one byte). In little-endian mode, the least significant byte of an integer is stored first, so a 4-byte integer with the value 3 is stored as

00000011 00000000 00000000 00000000

while in big-endian mode it is stored as

00000000 00000000 00000000 00000011

So in the first case the char * reads the first byte and yields 3, but in the second case it yields 0.
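Here is a minimal sketch that makes the byte order visible by printing every byte of the integer in memory order (lowest address first); the value 3 and the hex formatting are just illustrative choices:

#include <stdio.h>

int main(void)
{
    unsigned int i = 3;
    unsigned char *p = (unsigned char *)&i;

    /* Walk the bytes of i from lowest to highest address. */
    for (size_t n = 0; n < sizeof i; n++) {
        printf("byte %zu: %02x\n", n, p[n]);
    }
    return 0;
}

On a little-endian machine this prints 03 00 00 00; on a big-endian machine it prints 00 00 00 03.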
Had you omitted the cast and written simply:

char *c = &i;

the compiler would have warned about an incompatible pointer type. Had c been an integer pointer, dereferencing it would yield the integer value 3 irrespective of the endianness, because all four bytes would be interpreted.
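To illustrate that difference, the following sketch dereferences the same variable through both pointer types; the comments on the *cp line assume the byte layouts described above:

#include <stdio.h>

int main(void)
{
    unsigned int i = 3;
    unsigned int *ip = &i;   /* matching type: no cast, no warning */
    char *cp = (char *)&i;   /* reinterprets only the first byte */

    printf("*ip = %u\n", *ip);   /* 3 on any machine: all four bytes read */
    printf("*cp = %d\n", *cp);   /* 3 on little-endian, 0 on big-endian */
    return 0;
}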
NB: You need to initialize the variable i to see the whole picture; otherwise it holds an indeterminate (garbage) value.