python, neural-network, mini-batch

Mini-batch backpropagation clarification


I have read through a lot of articles and my brain is now drained, so I need a fresh perspective on the concept of mini-batches. I am new to machine learning and would appreciate any advice on whether my process is correct. Here is my premise:

I have a dataset with 355 features and 8 output classes, 12200 samples in total. This is a rough visualization of my neural network: Neural Network Sketch

I decided on 181 neurons for hidden layer 1 and 96 neurons for hidden layer 2. I used ReLU activation for the hidden layers and a logistic (sigmoid) activation for the output layer.
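A forward pass through the architecture described above (355 inputs, 181 ReLU units, 96 ReLU units, 8 logistic outputs) can be sketched as follows. The He-style weight initialization is an illustrative assumption, not something stated in the question:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(0.0, z)

def logistic(z):
    return 1.0 / (1.0 + np.exp(-z))

# He-style initialization for the ReLU layers (an assumption for illustration).
W1 = rng.normal(0, np.sqrt(2 / 355), (355, 181)); b1 = np.zeros(181)
W2 = rng.normal(0, np.sqrt(2 / 181), (181, 96));  b2 = np.zeros(96)
W3 = rng.normal(0, np.sqrt(2 / 96),  (96, 8));    b3 = np.zeros(8)

def forward(X):
    """X has shape (batch_size, 355); returns (batch_size, 8) activations in (0, 1)."""
    h1 = relu(X @ W1 + b1)
    h2 = relu(h1 @ W2 + b2)
    return logistic(h2 @ W3 + b3)

X_batch = rng.normal(size=(8, 355))  # one mini-batch of 8 samples
out = forward(X_batch)
print(out.shape)  # (8, 8)
```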

To do the mini-batch training, I set my batch size to 8, so I have a total of 1525 batches with 8 samples per batch. Here are my steps:
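The split of 12200 samples into 1525 batches of 8 can be checked with a short snippet. Shuffling the indices before batching is a common convention, assumed here rather than taken from the question:

```python
import numpy as np

n_samples, batch_size = 12200, 8
indices = np.arange(n_samples)
np.random.default_rng(0).shuffle(indices)  # reshuffling each epoch is common practice

# Slice the shuffled indices into consecutive mini-batches.
batches = [indices[i:i + batch_size] for i in range(0, n_samples, batch_size)]
print(len(batches))      # 1525
print(len(batches[0]))   # 8
```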

  1. Get the 1st batch of data (8 samples of 355 inputs and 8 outputs).
  2. Forward-propagate the batch.
  3. Get the errors and calculate the sum of squared errors, averaged over the batch first: SumError = (1/8)*sum(error^2).
  4. Back-propagate the batch.
  5. Get the average of the weight values after backpropagation.
  6. Use the new weights as the weights for the next batch.
  7. Get the next batch of data (8 samples of 355 inputs and 8 outputs).
  8. Repeat steps 2-7 using the new set of weights.
  9. When all batches are done, average the SumError values to get the sum of squares per epoch.
  10. Repeat steps 1-9 until the SumError per epoch is small.
  11. Use the final weights for validation.
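The steps above can be sketched as a plain mini-batch gradient-descent loop. This is a minimal NumPy sketch under the usual convention that each batch's gradient update is applied immediately and the updated weights feed the next batch (no averaging of weights across batches). The layer sizes, learning rate, and synthetic data are illustrative assumptions, shrunk from the question's dimensions:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(0.0, z)

def logistic(z):
    return 1.0 / (1.0 + np.exp(-z))

# Tiny synthetic stand-in for the real dataset (sizes shrunk for illustration).
n, d_in, h1, h2, d_out, batch_size = 64, 10, 6, 5, 3, 8
X = rng.normal(size=(n, d_in))
Y = rng.uniform(size=(n, d_out))

W1 = rng.normal(0, 0.1, (d_in, h1));  b1 = np.zeros(h1)
W2 = rng.normal(0, 0.1, (h1, h2));    b2 = np.zeros(h2)
W3 = rng.normal(0, 0.1, (h2, d_out)); b3 = np.zeros(d_out)
lr = 0.1  # learning rate, an illustrative choice

losses = []
for epoch in range(20):
    batch_losses = []
    for start in range(0, n, batch_size):
        xb, yb = X[start:start + batch_size], Y[start:start + batch_size]
        # Steps 1-2: forward-propagate one batch.
        a1 = relu(xb @ W1 + b1)
        a2 = relu(a1 @ W2 + b2)
        out = logistic(a2 @ W3 + b3)
        # Step 3: batch-averaged sum of squared errors.
        err = out - yb
        batch_losses.append(np.mean(err ** 2))
        # Step 4: back-propagate (MSE loss, sigmoid output, ReLU hidden layers).
        g_out = 2 * err * out * (1 - out) / len(xb)
        g_a2 = g_out @ W3.T * (a2 > 0)
        g_a1 = g_a2 @ W2.T * (a1 > 0)
        # Steps 5-6: apply the update immediately; these weights feed the next batch.
        W3 -= lr * a2.T @ g_out; b3 -= lr * g_out.sum(axis=0)
        W2 -= lr * a1.T @ g_a2;  b2 -= lr * g_a2.sum(axis=0)
        W1 -= lr * xb.T @ g_a1;  b1 -= lr * g_a1.sum(axis=0)
    # Step 9: average the per-batch losses to get the epoch loss.
    losses.append(float(np.mean(batch_losses)))

print(losses[0], losses[-1])  # the epoch loss should shrink over training
```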

That is my mini-batch process. Is it correct? Specifically, for the weights: do I use the weights calculated after each batch as the input weights for the next batch, or do I collect all the weight updates first (with the starting weights used for every batch), average them across all batches, and then use the averaged weights as input to the next epoch?


Solution

  • Actually, you have to define your epochs: each epoch should pass over all of your input data at least once (not just repeat steps 2-7 on part of it). The weights are updated as each epoch proceeds, and you repeat the steps until all epochs are finished.