Is there something in the C standard that implies only one representation for signed integers should be used? Specifically, does the standard prohibit using, for example, one's complement for int variables and two's complement for long int variables on the same machine?
If this is related to the hardware rather than the compiler, is there any hardware that allows the existence of two different representations for signed integers on the same machine?
Versions of the C standard prior to 2024 allowed each signed integer type to use any of two's complement, one's complement, or sign-and-magnitude (C 2018 6.2.6.2), and they did not require different signed integer types to make the same choice. So, in principle, an implementation could use one's complement for `int` and two's complement for `long int`.

C 2024 requires that all signed integer types use two's complement.