I tested right shift of negative numbers with Visual Studio, Ubuntu's GCC, the Intel compiler, and MinGW. All of them shift in the sign bit, and I assume Xcode's GCC does the same.
I know the behavior is implementation-defined, but it looks like all major desktop/server compilers implement arithmetic shift. Are there any widely used compilers that don't shift in the sign bit?
Thank you.
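For concreteness, the kind of test I mean (a minimal sketch; the value of -1 >> 1 is implementation-defined):

#include <stdio.h>

int main(void) {
    /* Prints -1 on compilers that shift in the sign bit (arithmetic shift),
       a large positive value on ones that shift in zeros (logical shift). */
    printf("-1 >> 1 == %d\n", -1 >> 1);
    return 0;
}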
C runs on a lot of different architectures. I mean a lot of different architectures. You can get C code running on an embedded DSP and on a Cray supercomputer.
Most of the "implementation-defined" parts of the C standard that people take for granted really only do break on obscure architectures. For example, there are DSPs and Cray supercomputers where CHAR_BIT
is something huge like 32 or 64. So if you try out your code on an x86, and maybe if you're generous a PowerPC, ARM, or SPARC, you're not likely to run into any of the really weird cases. And that's okay. Most code these days will always run on a byte-oriented architecture with twos-complement integers and arithmetic shifts. I have no doubt that any new CPU architectures in the foreseeable future will be the same.
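If your code would silently misbehave on such a platform, a compile-time check is cheap insurance (a sketch using C11's _Static_assert):

#include <limits.h>

/* Refuse to compile on platforms where a byte is not 8 bits. */
_Static_assert(CHAR_BIT == 8, "this code assumes 8-bit bytes");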
But let's look at how the two most common representations for signed integers, two's complement and ones' complement, behave here:
#include <stdio.h>

int main(void) {
    /* The value of (-1) >> 1 is implementation-defined. */
    switch ((-1) >> 1) {
    case 0:
        /* Note: adding "case -0:" would be an error, since -0 == 0 and
           duplicate case labels are a constraint violation. */
        puts("Hello, ones' complement world!");
        /* Possibly sign-magnitude. */
        break;
    case -1:
        puts("Hello, two's complement world!");
        break;
    default:
        puts("Hello, computer without arithmetic shift!");
        break;
    }
    return 0;
}
Don't sweat it. Just stick to / when you want to divide and >> when you need to shift. Even bad compilers are good at optimizing these operations. (And remember that x/2 != x>>1 when x is negative and odd: division truncates toward zero, while an arithmetic shift rounds toward negative infinity. The exception is a ones' complement machine, which you almost certainly aren't using.)
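To see the difference concretely (this assumes a two's complement machine with arithmetic shift, which is what you will almost certainly have):

#include <stdio.h>

int main(void) {
    int x = -5;
    /* Division truncates toward zero; arithmetic shift rounds down. */
    printf("x / 2  = %d\n", x / 2);   /* prints -2 */
    printf("x >> 1 = %d\n", x >> 1);  /* prints -3 with arithmetic shift */
    return 0;
}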
The standard does guarantee that if (int) x is not negative, then (int) x >> n == (unsigned) x >> n, so there is not a lot of room for a compiler to do something completely unexpected.
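A quick illustration of that guarantee (my own example, not taken from the standard):

#include <stdio.h>

int main(void) {
    int x = 100;  /* nonnegative, so the signed and unsigned shifts must agree */
    printf("%d %u\n", x >> 2, (unsigned) x >> 2);  /* prints: 25 25 */
    return 0;
}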