I have a Jupyter notebook mess that I am attempting to refactor into proper unit tests. I would like to copy-paste some output from actual data into my test cases. However, my data are long, gnarly fractional numbers, and for conciseness I would prefer less verbose values. I cannot simply round them to a fixed number of decimal places, because my values span many orders of magnitude and no single order of magnitude is the important one. What would be great is to round only the mantissa. For example:
3.5876e-4 --> 3.6e-4
2.1234e5 --> 2.1e5
What is the easiest way to round my values in this way in PyTorch?
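For illustration, this is the kind of thing I could write in plain Python (just a sketch of the intent, with a made-up helper name, not what I want in the tests):

from math import floor, log10

def round_mantissa(x, sig=2):
    # keep `sig` significant figures and leave the order of magnitude alone
    if x == 0:
        return 0.0
    return round(x, sig - 1 - floor(log10(abs(x))))

round_mantissa(3.5876e-4)  # 0.00036
round_mantissa(2.1234e5)   # 210000.0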
Do you need something like this?
import torch

v1 = torch.as_tensor(3.5876e-4)
v2 = torch.as_tensor(2.1234e+5)
decimals = 1

def mround(v, decimals):
    # shift the rounding position by the value's own exponent, so that
    # `decimals` digits of the mantissa are kept regardless of magnitude
    return torch.round(v, decimals=decimals - int(torch.floor(torch.log10(v)).item()))

v1r = mround(v1, decimals)
v2r = mround(v2, decimals)
print(f"{v1:.4e} --> {v1r:.4e} --> {v1r:.{decimals}e}")
print(f"{v2:.4e} --> {v2r:.4e} --> {v2r:.{decimals}e}")
Result:
3.5876e-04 --> 3.6000e-04 --> 3.6e-04
2.1234e+05 --> 2.1000e+05 --> 2.1e+05
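If the values come as whole tensors rather than 0-dim scalars, the decimals= trick above does not apply element-wise (it takes a single int), so here is a sketch of an element-wise variant that scales each element by its own power of ten; the function name and the zero/negative handling are my own additions:

import torch

def mround_elementwise(t, sig=2):
    # round every element to `sig` significant figures,
    # working on the magnitude so negatives and zeros are handled too
    t = torch.as_tensor(t, dtype=torch.float64)
    mag = t.abs()
    exp = torch.where(mag > 0, torch.floor(torch.log10(mag)), torch.zeros_like(mag))
    scale = torch.pow(10.0, sig - 1 - exp)
    return torch.round(t * scale) / scale

print(mround_elementwise(torch.tensor([3.5876e-4, 2.1234e+5, 0.0, -7.89e-7])))
# values come out as 3.6e-4, 2.1e5, 0.0, -7.9e-7 (exact print formatting may vary)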
Keep in mind that rounding the mantissa will not make the stored value any more compact, because the underlying float representation is binary. And if you keep rounded benchmark results in the notebook as expected values, you will have to round the actual test outputs the same way (or compare with a tolerance) before every comparison.
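On that last point, a minimal sketch of what such a comparison could look like in a test, assuming the copied constants are rounded to two significant figures and a relative tolerance is acceptable (the test name and values are illustrative):

import torch

def test_matches_notebook_benchmark():
    # expected values copy-pasted from the notebook, rounded to 2 significant figures
    expected = torch.tensor([3.6e-4, 2.1e5])
    # stand-in for the real computation under test
    actual = torch.tensor([3.5876e-4, 2.1234e5])
    # a 5% relative tolerance absorbs the loss from mantissa rounding,
    # so the stored constants can stay short
    torch.testing.assert_close(actual, expected, rtol=5e-2, atol=0.0)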