Here are two similar constraint blocks, one written in decimal notation and the other in hexadecimal notation. The first works as expected, but the second generates only non-negative values (including 0) out of the 5 available values:
-- positive and negative values generated as expected
var rnd_byte : int(bits: 8);
for i from 0 to 9 {
    gen rnd_byte keeping {
        soft it == select {
            90 : [-1, -128, 127, 1];
            10 : 0x00;
        };
    };
    print rnd_byte;
};
-- only positive values (including 0) generated!!!
var rnd_byte : int(bits: 8);
for i from 0 to 9 {
    gen rnd_byte keeping {
        soft it == select {
            90 : [0xFF, 0x80, 0x7F, 0x01];
            10 : 0x00;
        };
    };
    print rnd_byte;
};
How can I make the second example behave like the first one while keeping the hexadecimal notation? I don't want to write large decimal numbers.
Some more about this issue: with procedural code there is automatic casting, so you can write
var rnd_byte : int(bits: 8);
rnd_byte = 0xff;
and it will result in rnd_byte == -1.
Constraints work with int(bits: 8) semantics, so this code would fail:
var rnd_byte : int(bits: 8);
gen rnd_byte keeping { it == 0xff };
As suggested, to get 0xff, define the field as unsigned.
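A minimal sketch of the second loop with the field declared as unsigned (nothing else changed):
var rnd_byte : uint(bits: 8); -- unsigned 8-bit: legal range is 0..255, so 0xFF and 0x80 are valid
for i from 0 to 9 {
    gen rnd_byte keeping {
        soft it == select {
            90 : [0xFF, 0x80, 0x7F, 0x01]; -- 255, 128, 127, 1
            10 : 0x00;
        };
    };
    print rnd_byte;
};
With an unsigned field, 0xFF is generated as 255 rather than -1; if the signed interpretation is needed in procedural code, the value can be converted back, e.g. with as_a(int(bits: 8)).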