python, machine-learning, deep-learning, neural-network, mini-batch

What is the right way of mini-batching the validation set while training?


I am training a neural network. For training I take 80% of my data and divide it into a number of mini-batches. I train on each mini-batch and update the parameters, until all the data has been visited. I repeat the whole procedure for a number of epochs.

The question is about the remaining 10% + 10% of the data: how should I handle the validation set during this process? Should I use rotating mini-batches for the validation set as well?
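For concreteness, here is a minimal sketch of the setup described above. The function names and the NumPy-based data handling are assumptions for illustration, not taken from the original post.

```python
import numpy as np

def split_data(X, y, seed=0):
    """Shuffle and split into 80% train / 10% validation / 10% test."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    n_train = int(0.8 * len(X))
    n_val = int(0.1 * len(X))
    train, val, test = np.split(idx, [n_train, n_train + n_val])
    return (X[train], y[train]), (X[val], y[val]), (X[test], y[test])

def iterate_minibatches(X, y, batch_size, rng):
    """Yield shuffled mini-batches covering the whole array once (one epoch)."""
    idx = rng.permutation(len(X))
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]
        yield X[batch], y[batch]
```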


Solution

  • I think this question is more or less answered here: What is the meaning of batch_size for validation?

    Since you are not training the model on the validation data, the batch size does not affect the results. In other words, because you do not apply mini-batch gradient descent while validating, it does not really matter how the validation set is batched. It may have an impact memory-wise, though (see the sketch after this answer).
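
As a hedged illustration of that point, here is a minimal PyTorch-style sketch (the `model`, `val_loader`, and `loss_fn` names are assumptions, not from the linked answer). Because no gradients are computed and no parameters are updated, the validation batch size only affects memory use and speed, not the averaged result.

```python
import torch

@torch.no_grad()  # no gradients: validation never updates parameters
def evaluate(model, val_loader, loss_fn, device="cpu"):
    """Average loss over the validation set, processed in mini-batches.

    The batch size of `val_loader` changes memory use and speed only;
    the weighted average below gives the same value regardless of how
    the validation set is chopped into batches.
    """
    model.eval()  # switch layers like dropout/batch-norm to inference mode
    total_loss, total_examples = 0.0, 0
    for xb, yb in val_loader:
        xb, yb = xb.to(device), yb.to(device)
        preds = model(xb)
        # weight by batch size so a smaller final batch is not over-counted
        total_loss += loss_fn(preds, yb).item() * len(xb)
        total_examples += len(xb)
    return total_loss / total_examples
```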