I'm interested in what the optimal way is to calculate the number of bits set in a byte. Is the following approach a good one?
template< unsigned char byte > class BITS_SET
{
public:
    enum {
        B0 = (byte & 0x01) ? 1 : 0,
        B1 = (byte & 0x02) ? 1 : 0,
        B2 = (byte & 0x04) ? 1 : 0,
        B3 = (byte & 0x08) ? 1 : 0,
        B4 = (byte & 0x10) ? 1 : 0,
        B5 = (byte & 0x20) ? 1 : 0,
        B6 = (byte & 0x40) ? 1 : 0,
        B7 = (byte & 0x80) ? 1 : 0
    };
public:
    enum { RESULT = B0 + B1 + B2 + B3 + B4 + B5 + B6 + B7 };
};
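A minimal usage sketch, assuming the class above is in scope; note that the template argument must be a compile-time constant, and get_byte_somehow below is just a hypothetical placeholder:

#include <iostream>

int main()
{
    // The argument is a compile-time constant, so the count is computed
    // entirely at compile time.
    std::cout << BITS_SET<0xAB>::RESULT << std::endl;  // 0xAB = 10101011 -> prints 5

    // unsigned char b = get_byte_somehow();   // a run-time value...
    // std::cout << BITS_SET<b>::RESULT;       // ...would not compile here
    return 0;
}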
Maybe it is only a good approach when the value of the byte is known at compile time? Is it recommended to use this in real code?
For 8-bit values, just use a 256-element lookup table.
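A minimal sketch of that approach, assuming the table is filled once at startup (the names here are illustrative):

// bits_in_byte[i] holds the number of set bits in the byte value i.
static unsigned char bits_in_byte[256];

void init_bits_table()
{
    // bits(i) = (lowest bit of i) + bits(i / 2); the array is zero-initialized,
    // so bits_in_byte[0] is already correct.
    for (int i = 1; i < 256; ++i)
        bits_in_byte[i] = static_cast<unsigned char>((i & 1) + bits_in_byte[i / 2]);
}

unsigned count_bits(unsigned char b)
{
    return bits_in_byte[b];   // single table lookup at run time
}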
For larger inputs, it's slightly less trivial. Sean Eron Anderson's Bit Twiddling Hacks page has several different functions for this, all with different performance characteristics. There is no single be-all-end-all fastest version, since it depends on the nature of your processor (pipeline depth, branch predictor, cache size, etc.) and the data you're working with.
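For example, the parallel ("SWAR") method for a 32-bit word from that page looks roughly like this, sketched here as a standalone function:

#include <cstdint>

// Counts the set bits of a 32-bit word without branches or tables:
// sums adjacent bits in pairs, then in 4-bit groups, then adds up the
// per-byte counts with a multiply.
unsigned popcount32(std::uint32_t v)
{
    v = v - ((v >> 1) & 0x55555555u);                 // 2-bit sums
    v = (v & 0x33333333u) + ((v >> 2) & 0x33333333u); // 4-bit sums
    return (((v + (v >> 4)) & 0x0F0F0F0Fu) * 0x01010101u) >> 24; // sum the bytes
}

In practice, a compiler intrinsic such as GCC/Clang's __builtin_popcount (or std::popcount in C++20) is usually worth trying first, since it maps to a single instruction on hardware that has one.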