I'm training a U-Net. During training, each iteration reports a loss value. This value generally converges, but sometimes jumps around. Which weights are finally saved in the .caffemodel file?
What happens if I save at iteration 20000, and that just so happens to be a point where the loss jumped up a bit and isn't the lowest loss seen so far? Are the saved weights and biases simply those from the last iteration, or something smarter, like the lowest loss over the last 5% of iterations?
Thank you
The solver.prototxt file has a parameter called "snapshot":
net: "path/to/train.prototxt"
.
.
max_iter: 20000
snapshot: 1000
snapshot_prefix: "path/to/caffemodel/"
solver_mode: GPU
For example, with snapshot: 1000, Caffe writes one .caffemodel file every 1000 iterations containing the weights exactly as they are at that iteration, regardless of whether the loss was lower at an earlier iteration. The same applies to the final snapshot at max_iter: the saved weights are simply those of that iteration, not the best ones seen during training.
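If you want "something smarter" than the weights at a fixed iteration, one option is to drive training yourself from pycaffe and save a snapshot only when the loss improves. This is not built-in Caffe behaviour; the sketch below assumes your loss blob is named "loss" and uses placeholder paths, so adjust both to your own net and solver:

# Minimal sketch: keep only the best-loss weights by stepping the solver manually.
# Assumptions: the training net has a blob named 'loss', and the paths are placeholders.
import caffe

caffe.set_mode_gpu()
solver = caffe.SGDSolver('path/to/solver.prototxt')

best_loss = float('inf')
max_iter = 20000

for it in range(max_iter):
    solver.step(1)  # run one training iteration
    loss = float(solver.net.blobs['loss'].data)  # per-iteration loss; noisy, so you may want to smooth it
    if loss < best_loss:
        best_loss = loss
        # save the weights of the current best-so-far iteration
        solver.net.save('path/to/caffemodel/best.caffemodel')

Because the per-iteration loss is noisy, in practice you might average it over a window (or evaluate on a validation set) before deciding whether to overwrite the saved model.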