If the representation of a long int and an int are the same on a platform, are they strictly the same? Do the types behave differently on that platform in any way, according to the C standard?
E.g. does this always work:
int int_var;
long long_var;
void long_bar(long *l);
void int_bar(int *i);
void foo()
{
    long_bar(&int_var); /* Always OK? */
    int_bar(&long_var);
}
I guess the same question applies to short and int, if they happen to have the same representation.
The question arose when discussing how to define an int32_t-like typedef for an embedded C89 compiler without stdint.h, i.e. whether to base it on int or long, and whether the choice would matter.
They are not compatible types, which you can see with a simple example:
int* iptr;
long* lptr = iptr; // compiler error here
So it mostly matters when dealing with pointers to these types. Similarly, there is the "strict aliasing rule" which makes this code undefined behavior:
int i;
long* lptr = (long*)&i;
*lptr = ...; // undefined behavior
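If you need to move a value between the two types without going through an incompatible pointer, copying the bytes is one well-defined option. A minimal sketch (the helper name is just for illustration), which only makes sense when int and long really do have the same size on the target:
#include <string.h>

/* Copy the object representation of an int into a long with memcpy,
   which is well-defined regardless of the strict aliasing rule.
   Only meaningful if sizeof(long) == sizeof(int) on this platform. */
void copy_int_to_long(const int *src, long *dst)
{
    memcpy(dst, src, sizeof *src);
}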
Another subtle issue is implicit promotion. If you write some_int + some_long, the resulting type of that expression is long. Or, if either operand is unsigned (and the types are the same size, as assumed here), unsigned long. This is because of the integer promotions and the usual arithmetic conversions, see Implicit type promotion rules.
This shouldn't matter most of the time, but code such as _Generic(some_int + some_long, int: stuff() ) will fail to compile, since there is no long clause in the generic selection.
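For illustration, here is a small self-contained sketch of that pitfall; it needs a C11 compiler, since _Generic does not exist in C89, and the variable names are just placeholders:
#include <stdio.h>

int main(void)
{
    int some_int = 1;
    long some_long = 2;

    /* some_int is converted to long by the usual arithmetic conversions,
       so the controlling expression has type long, not int. */
    puts(_Generic(some_int + some_long,
                  int:     "int",
                  long:    "long",   /* this association is selected */
                  default: "something else"));
    return 0;
}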
Generally, when assigning values between these types, there shouldn't be any problems. In the case of uint32_t, it doesn't matter which type it corresponds to, because you should treat uint32_t as a separate type anyway. I'd base it on long for compatibility with small microcontrollers, where typedef unsigned int uint32_t; will break because int is often only 16 bits there. (And obviously, typedef signed long int32_t; for the signed equivalent.)
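As a rough sketch of what that could look like for such a C89 target (the assumption that long is exactly 32 bits is mine), with a poor man's static assert since C89 has no static_assert:
#include <limits.h>

/* Hypothetical stdint.h substitute for a C89 target where long is 32 bits. */
typedef signed long   int32_t;
typedef unsigned long uint32_t;

/* C89 compile-time check: the array size evaluates to -1 (a constraint
   violation) if uint32_t is not exactly 32 bits wide. */
typedef char assert_uint32_is_32_bits[(sizeof(uint32_t) * CHAR_BIT == 32) ? 1 : -1];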