
Any difference or risk between "var & 0xFFFFFFFFu" vs. "var & 0xFFFFFFFF"?


When I need to get the lower 32 bits of a variable:

Is there any difference between var & 0xFFFFFFFFu vs. var & 0xFFFFFFFF?

Is there any risk if I omit the "u" suffix?


Solution

  • Is there any difference between var & 0xFFFFFFFFu vs. var & 0xFFFFFFFF?

    Commonly, no functional difference.

    0xFFFFFFFF is a hexadecimal constant. Its type is determined by its value: it is the first of int, unsigned, long, unsigned long, long long, unsigned long long in which the value fits. Given common integer widths of 16, 32, and 64 bits, that is very commonly an unsigned type. Only when a signed type in that list can represent the value before any unsigned one can (think a 36-bit long on a machine with a 16-bit int) will this constant be signed.

    0xFFFFFFFFu is a hexadecimal constant. Its type is determined by its value: it is the first of unsigned, unsigned long, unsigned long long in which the value fits. It is always some unsigned type.

    On common systems, var & 0xFFFFFFFFu and var & 0xFFFFFFFF will produce the same value and type. The type of the & result depends on the type of var through the usual arithmetic conversions - think unsigned long long vs. long long vs. int.
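
    To see this concretely, here is a minimal C11 sketch (the TYPE_NAME macro is illustrative, not a standard facility) that uses _Generic to report the type each constant receives on the platform it is compiled on:

        #include <stdio.h>

        /* Map an expression to the name of its type. */
        #define TYPE_NAME(x) _Generic((x), \
            int: "int", \
            unsigned int: "unsigned int", \
            long: "long", \
            unsigned long: "unsigned long", \
            long long: "long long", \
            unsigned long long: "unsigned long long", \
            default: "other")

        int main(void) {
            puts(TYPE_NAME(0xFFFFFFFF));  /* "unsigned int" where int is 32-bit */
            puts(TYPE_NAME(0xFFFFFFFFu)); /* always some unsigned type */
            return 0;
        }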

  • Is there any risk if I omit the "u" suffix?

    No, scant risk on common systems.

    Best to follow your group's coding style here.

    I would recommend appending the u. There is little downside to adding it, and the result is likely easier to review and maintain.

    In general, & and the other bitwise operators are best applied to unsigned operands. Signed operands often require extra analysis and incur risks such as unintended sign extension.
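
    As a small sketch of one such risk (the variable name and values are illustrative, and a typical 32-bit int is assumed): a signed all-ones mask sign-extends when converted to a wider unsigned type, while an unsigned one zero-extends:

        #include <stdint.h>
        #include <stdio.h>
        #include <inttypes.h>

        int main(void) {
            uint64_t var = 0x123456789ABCDEF0u;
            /* ~0 is a signed int holding -1; conversion to uint64_t
               sign-extends it to all ones, so the "mask" clears nothing. */
            printf("%016" PRIX64 "\n", var & ~0);   /* 123456789ABCDEF0 */
            /* ~0u is unsigned int 0xFFFFFFFF; it zero-extends as intended. */
            printf("%016" PRIX64 "\n", var & ~0u);  /* 000000009ABCDEF0 */
            return 0;
        }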


    As I do not like naked magic numbers that oblige a reviewer to count Fs, nor casts, I recommend the following (UINT32_MAX is from <stdint.h>) to get the lower 32 bits of a variable:

    var & UINT32_MAX;
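
    A self-contained version of that recommendation (the variable name and value are illustrative):

        #include <stdint.h>
        #include <stdio.h>
        #include <inttypes.h>

        int main(void) {
            uint64_t var = 0x123456789ABCDEF0u;
            uint64_t low32 = var & UINT32_MAX;  /* keep the lower 32 bits */
            printf("0x%" PRIX64 "\n", low32);   /* prints 0x9ABCDEF0 */
            return 0;
        }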