I am trying to replicate, in Python, a custom CRC-16 algorithm implemented in MATLAB. The original MATLAB code uses the comm.CRCGenerator System object with the polynomial 'X^16 + X^12 + X^5 + 1'.
The MATLAB code and its output are as follows:
```matlab
crc_generator = comm.CRCGenerator('Polynomial', 'X^16 + X^12 + X^5 + 1', 'DirectMethod', false);
data = [1,2,3,4,5];
input_data = data(1:end-2);  % drop the last two elements -> [1, 2, 3]
seq = crc_generator(reshape(logical(de2bi(input_data, 8, 'left-msb'))', [], 1));
```
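For reference, the `de2bi(..., 'left-msb')` plus transpose-and-reshape step above just serializes each byte into a bit column, most significant bit first. A rough Python equivalent (`bytes_to_msb_bits` is my own helper name, not a MATLAB or crcmod API) would be:

```python
def bytes_to_msb_bits(values):
    """Mirror MATLAB's reshape(logical(de2bi(values, 8, 'left-msb'))', [], 1):
    emit each value as 8 bits, most significant bit first."""
    return [int(b) for v in values for b in format(v, '08b')]

# The first byte (1) becomes the first 8 entries of the column vector.
print(bytes_to_msb_bits([1, 2, 3])[:8])  # [0, 0, 0, 0, 0, 0, 0, 1]
```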
For data = [1,2,3,4,5] (so CRC input [1,2,3]), the output is a 40×1 logical vector; transposed and grouped into bytes here for readability:

```matlab
seq =

  40×1 logical array

  0 0 0 0 0 0 0 1   0 0 0 0 0 0 1 0   0 0 0 0 0 0 1 1   0 1 1 0 0 0 0 1   0 0 1 1 0 0 0 1
```
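To sanity-check the structure of that output: the first 24 bits are just the three input bytes MSB-first, and the last 16 bits are the appended checksum. A quick decomposition (the flattened bit string below is copied from the output above):

```python
# The 40-bit seq above, flattened into one string: 24 data bits + 16 CRC bits
bits = '0000000100000010000000110110000100110001'

data_bits, crc_bits = bits[:24], bits[24:]
data_bytes = [int(data_bits[i:i + 8], 2) for i in range(0, 24, 8)]

print(data_bytes)             # [1, 2, 3]
print(hex(int(crc_bits, 2)))  # 0x6131
```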
I'm attempting to replicate this in Python using the crcmod library. Here's my attempt:

```python
import crcmod

crc16_func = crcmod.mkCrcFun(0x11021, initCrc=0, xorOut=0xFFFF)
data = [1, 2, 3, 4, 5]
input_data = data[:-2]
bit_stream = ''.join(format(byte, '08b') for byte in input_data)
seq = crc16_func(bit_stream.encode('utf-8'))
seq_bin = format(seq, '016b')
```
The output from the Python code is `seq = 39537` and `seq_bin = '1001101001110001'`.
I'm unable to get the same binary sequence result in Python as in MATLAB, even after appropriately converting the data and using the same polynomial for the CRC calculation. Could anyone guide me on how to implement this custom CRC-16 in Python to match MATLAB's comm.CRCGenerator behavior?
Any help would be greatly appreciated. Thank you!
This:

```python
import crcmod

# Positional arguments: poly=0x11021, initCrc=0, rev=False (MSB-first), xorOut=0
crc16_func = crcmod.mkCrcFun(0x11021, 0, False, 0)
crc = crc16_func(bytes([1, 2, 3]))
print(hex(crc))
```

prints `0x6131`, which is the last 16 bits of your example generated in MATLAB (`0110000100110001`).

Two things were off in your attempt. First, `bit_stream.encode('utf-8')` makes the CRC run over the ASCII characters `'0'` and `'1'` (bytes 0x30/0x31) rather than the original data bytes. Second, the parameters didn't match: crcmod's default `rev=True` reflects the bit order and your `xorOut=0xFFFF` inverts the result, whereas comm.CRCGenerator by default uses zero initial conditions, no reflection, and no final XOR.
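To rebuild the full 40-bit sequence, and to show there is no magic in crcmod here, the same MSB-first CRC-16 (poly 0x1021, init 0, no reflection, no final XOR) can be sketched in pure Python; `crc16_msb` and the variable names below are my own, not from MATLAB or crcmod:

```python
def crc16_msb(data: bytes, poly: int = 0x1021, init: int = 0) -> int:
    """Bitwise MSB-first CRC-16 (non-reflected, no final XOR)."""
    crc = init
    for byte in data:
        crc ^= byte << 8          # bring the next byte into the top of the register
        for _ in range(8):
            if crc & 0x8000:      # top bit set: shift and subtract the polynomial
                crc = ((crc << 1) ^ poly) & 0xFFFF
            else:
                crc = (crc << 1) & 0xFFFF
    return crc

data = [1, 2, 3, 4, 5]
payload = bytes(data[:-2])        # MATLAB's data(1:end-2) -> bytes [1, 2, 3]
crc = crc16_msb(payload)

# Rebuild the 40-bit sequence: 8 bits per byte (MSB first) + 16 CRC bits
bits = ''.join(format(b, '08b') for b in payload) + format(crc, '016b')
print(hex(crc))  # 0x6131
print(bits)      # matches the MATLAB seq, read top to bottom
```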