I am trying to generate a torch vector of a specific length. When I increase the length while using the same seed, I want the vector to keep the same leading elements.
This works when the vector's length ranges from 1 to 15, for example:
For length 14
torch.manual_seed(1)
torch.randn(14)
tensor([ 0.6614, 0.2669, 0.0617, 0.6213, -0.4519, -0.1661, -1.5228, 0.3817,
-1.0276, -0.5631, -0.8923, -0.0583, -0.1955, -0.9656])
For length 15
torch.manual_seed(1)
torch.randn(15)
tensor([ 0.6614, 0.2669, 0.0617, 0.6213, -0.4519, -0.1661, -1.5228, 0.3817,
-1.0276, -0.5631, -0.8923, -0.0583, -0.1955, -0.9656])
But for length 16 I get a totally different vector:
torch.manual_seed(1)
torch.randn(16)
tensor([-1.5256, -0.7502, -0.6540, -1.6095, -0.1002, -0.6092, -0.9798, -1.6091,
-0.7121, 0.3037, -0.7773, -0.2515, -0.2223, 1.6871, 0.2284, 0.4676])
Can someone explain what's happening, and is there a solution where the leading elements of the vector do not change?
This is due to the behaviour of the PRNG: depending on the requested size, different code paths may be used, and there is no guarantee that different sequence lengths will produce exactly the same output samples from the PRNG.
The outputs for lengths 1-15 match, while starting from length 16 another (probably vectorized) code path is used; changes in the sequence length can dispatch to faster code paths.
Source: Odd result using multinomial num_samples...
It seems that this may not be an issue in older torch versions.
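One workaround, assuming you know an upper bound on the lengths you will ever need, is to always ask the PRNG for that fixed maximum number of samples and slice off the prefix you want. Since the request size never changes, the same code path is taken every time, so the leading elements are stable. (The function name, seed default, and `MAX_LEN` bound below are illustrative choices, not part of any torch API.)

```python
import torch

MAX_LEN = 64  # assumption: an upper bound on any length you will request

def seeded_randn(length, seed=1, max_len=MAX_LEN):
    # Always draw the same number of samples so the PRNG takes the same
    # internal code path, then keep only the prefix we actually need.
    torch.manual_seed(seed)
    return torch.randn(max_len)[:length].clone()

v14 = seeded_randn(14)
v16 = seeded_randn(16)
# The first 14 elements now agree regardless of the requested length.
print(torch.equal(v14, v16[:14]))  # True
```

The `.clone()` is optional; it just detaches the returned prefix from the larger backing tensor so the unused tail can be freed.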