The stdint.h header lacks int_fastest_t and uint_fastest_t types to correspond with the {,u}int_fastX_t types. For cases where the width of the integer type does not matter, how does one pick the integer type that allows processing the greatest quantity of bits with the least penalty to performance? For example, if one were searching for the first set bit in a buffer using a naive approach, a loop such as this might be considered:
// return the bit offset of the first 1 bit
// (uint_fastest_t and ffsX stand for the hypothetical type and its bit-scan function)
size_t find_first_bit_set(void const *const buf)
{
    uint_fastest_t const *p = buf; // use the fastest type for comparison to zero
    for (; *p == 0; ++p); // inc p while no bits are set
    // bits before the current word, plus the index of the first set bit within it
    return ((char const *)p - (char const *)buf) * CHAR_BIT + ffsX(*p) - 1;
}
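For concreteness, here is a minimal compilable sketch with the placeholders pinned down: unsigned long stands in for the hypothetical uint_fastest_t, and the GCC/Clang builtin __builtin_ctzl (count of trailing zero bits, equivalent to ffsX(x) - 1 for nonzero x) stands in for ffsX. The function name and both substitutions are assumptions, not anything mandated by the standard.

#include <limits.h>
#include <stddef.h>

size_t find_first_bit_set_long(void const *const buf)
{
    unsigned long const *p = buf;   // scan one word at a time
    for (; *p == 0; ++p)            // skip all-zero words
        ;
    // bits before the current word, plus the lowest set bit's index within it
    return (size_t)((char const *)p - (char const *)buf) * CHAR_BIT
         + (size_t)__builtin_ctzl(*p);
}

As with the original, this assumes the buffer is suitably aligned for unsigned long and contains at least one set bit; otherwise the loop runs off the end.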
Naturally, using char would result in more operations than int. But long long might result in more expensive operations than the overhead of using int on a 32-bit system, and so on.
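One way to test that intuition rather than guess is a small timing harness that runs the same zero-scan with different word types over a zeroed buffer. Everything below (the TIME_SCAN macro, the buffer size, the output format) is an illustrative sketch, not a rigorous benchmark; real measurements would need repeated runs and care around compiler optimization.

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

// time the zero-scan loop for one candidate word type T
#define TIME_SCAN(T, buf, len)                                      \
    do {                                                            \
        T const *p = (T const *)(buf);                              \
        T const *const end = p + (len) / sizeof(T);                 \
        clock_t const t0 = clock();                                 \
        while (p < end && *p == 0)                                  \
            ++p;                                                    \
        printf("%-20s %8ld clocks (stopped at byte %td)\n",         \
               #T, (long)(clock() - t0),                            \
               (char const *)p - (char const *)(buf));              \
    } while (0)

int main(void)
{
    enum { N = 1 << 24 };              // 16 MiB of zeros
    unsigned char *const buf = calloc(N + 8, 1);
    if (!buf)
        return 1;
    buf[N] = 1;                        // sentinel so every scan terminates
    TIME_SCAN(unsigned char,      buf, N + 8);
    TIME_SCAN(unsigned short,     buf, N + 8);
    TIME_SCAN(unsigned int,       buf, N + 8);
    TIME_SCAN(unsigned long,      buf, N + 8);
    TIME_SCAN(unsigned long long, buf, N + 8);
    free(buf);
    return 0;
}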
My current assumption is that for mainstream architectures, long is the safest bet: it's 32 bits on 32-bit systems, and 64 bits on most 64-bit systems (LLP64 targets such as 64-bit Windows are the exception, keeping long at 32 bits). For all existing mainstream architectures, long is the fastest type at present for loop throughput.
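If that assumption holds, the missing types can be papered over with a project-local typedef until something standard exists. A sketch (the names are hypothetical, not part of <stdint.h>; on LLP64 targets uintptr_t would be an alternative that still tracks the native word size):

// project-local stand-ins for the missing "fastest" types (hypothetical names)
typedef unsigned long uint_fastest_t;
typedef long          int_fastest_t;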