c++ binary hex chess bitboard

Strange error when bit shifting uint64_t by a uint16_t in cpp


The function below attempts to create a bitboard with the set bit in the Nth position by bit-shifting 0x1 left N times. N is given by the six least significant bits of a uint16_t, which is masked to isolate those 6 LSBs.

#include <cstdint>

uint64_t endSquareFinder(uint16_t a){
    a &= (0x003f);                  // isolate the 6 least significant bits
    return 0x0000000000000001 << a; // set the a-th bit of the result
}

All inputs work except a = 0x001f, for which the function returns 0xffffffff80000000 rather than the expected 0x0000000080000000. This seems very bizarre to me.
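
For reference, a minimal reproduction (a sketch that just calls the function above and prints the result; the value shown is what I observe):

#include <cstdint>
#include <cstdio>

int main() {
    // prints ffffffff80000000 instead of the expected 0000000080000000
    std::uint64_t board = endSquareFinder(0x001f);
    std::printf("%016llx\n", static_cast<unsigned long long>(board));
}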



Solution

  • You need to make the 1 an unsigned 64-bit integer. Right now it's an int ...

    #include <type_traits>
    // this passes:
    static_assert(std::is_same_v<decltype(0x0000000000000001), int>);
    

    ... which is most likely only 32 bits. Shifting the 1 into bit 31 lands in the sign bit, and the resulting negative int is sign-extended when it is converted to the uint64_t return type, which is where the 0xffffffff80000000 comes from.

    Example:

    return std::uint_least64_t(1) << a;
    // or
    return 1ull << a; // "unsigned long long int" is at least 64 bits
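
    Putting it together, a corrected version of the original function could look like this (a sketch that keeps the question's name and masking, just with an unsigned 64-bit constant):

    #include <cstdint>

    uint64_t endSquareFinder(uint16_t a){
        a &= (0x003f);     // isolate the 6 least significant bits
        return 1ULL << a;  // 64-bit unsigned 1, so the shift never reaches a sign bit
    }

    With this change, endSquareFinder(0x001f) returns 0x0000000080000000 as intended.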