Tags: c++, class, type-conversion, bit-fields, uint16

type conversion from int to class behaving weirdly


So, I am trying to convert a uint16_t (a 16-bit integer) to a class, in order to get at the class member variables, but it is not working as expected.


#include <cstdint>
#include <iostream>
using namespace std;

class test{
public:
    uint8_t m_pcp : 3; // 3-bit field
    bool m_dei : 1;
    uint16_t m_vid : 12; // 12-bit field

public:
    test(uint16_t vid, uint8_t pcp=0, bool dei=0) {
        m_vid = vid;
        m_pcp = pcp;
        m_dei = dei;
    }
};

int main() {
    uint16_t tci = 65535;
    test t = (test)tci;

    cout<<"pcp: "<<t.m_pcp<<" dei: "<<t.m_dei<<" vid "<<t.m_vid<<"\n";
    return 0;
}

Expected output:

pcp:1  dei: 1 vid 4095

The actual output:

pcp:  dei: 0 vid 4095

Also,

cout<<sizeof(t)

prints 2. Shouldn't it be 4?

Am I doing something wrong?


Solution

  • test t = (test)tci;
    

    This line does not perform the cast you expect (that would be a reinterpret_cast, and it would not even compile here). A C-style cast to a class type is a functional-style cast: it simply calls your constructor with tci as the first argument and the defaults for the rest. So m_vid is assigned 65535 truncated to 12 bits (4095), while m_pcp and m_dei are assigned their defaults, 0. Try removing the constructor and you will see that the line no longer compiles. The blank pcp in your output is a separate issue: m_pcp is a uint8_t, which operator<< streams as a character, so the value 0 prints as an invisible NUL character; cast it to int (e.g. static_cast<int>(t.m_pcp)) to print the number.

    The only way I know to do what you want is to write a correct constructor, like so:

    test(uint16_t i) {
        m_vid = i & 0x0fff; // low 12 bits
        i >>= 12;
        m_dei = i & 0x1;    // next bit
        i >>= 1;
        m_pcp = i & 0x7;    // top 3 bits
    }
    

    Demo

    Also, I'm not sure why you would expect m_pcp to be 1: the 3 highest bits of 65535 are all set, which makes 7.

    Also, cout<<sizeof(t) returns 2. shouldn't it be 4?

    No: 3 + 1 + 12 = 16 bits, which is 2 bytes. (Bit-field layout is implementation-defined, so a different compiler could pad differently, but here the three fields pack exactly into one 16-bit unit.)