matlab, neural-network, prediction, activation-function, feed-forward

After training, is the prediction of the neural network achieved by just forward propagating from input to output layer?


I made a simple feedforward neural net in MATLAB as follows:

mynet = feedforwardnet(5);
mynet.layers{1}.transferFcn = 'poslin';  % one hidden layer (5 neurons) with poslin = ReLU activation
mynet.layers{2}.transferFcn = 'purelin'; % output layer with a simple linear activation function

I want to train this neural network to learn a non-linear function that looks like this: Original function. So it is basically a regression problem: we have two inputs (u1, u2) and one output (y).

The neural net is trained, and now, to estimate the output, we can simply do:

x = [3; 2];            % u1 = 3, u2 = 2 (named x to avoid shadowing the built-in input function)
y_predicted = mynet(x) % gives the output for an input

This gives y_predicted = 2.9155. Okay, that is fine. The prediction is good (since y_true = 3). But I don't understand how this value was computed.

Then, when I checked it manually by forward propagation, I got a different result. That is, I extracted the final weights and biases after training with:

W1 = mynet.IW{1,1}; b1 = mynet.b{1}; W2 = mynet.LW{2,1}; b2 = mynet.b{2};

Then I did the forward propagation:

Z1 = W1*[3; 2] + b1; 
A1 = poslin(Z1); % applying ReLU activation function 
Z2 = W2*A1 + b2;
A2 = Z2; % linear activation function
y_predicted = A2;

Now I get y_predicted = 2.2549. Not as good a prediction, but at least I understand how this value was computed.

Shouldn't both predicted values be the same? Am I missing something?


Solution

  • This was happening due to the preprocessing of the input and output, which happens by default when you create a feedforward net in MATLAB (the default processing functions include mapminmax, which rescales each variable to the range [-1, 1] based on the training data). When checking it manually, we were not taking this into account and were feeding the raw data directly into the weight equations.


    The problem was resolved by deactivating these processing functions (before training the network) with:

    mynet.input.processFcns = {}; 
    mynet.output.processFcns = {};
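An alternative to disabling the preprocessing is to replicate it in the manual forward pass. By default, mapminmax linearly maps each variable from its training-data range to [-1, 1] on the way in, and applies the reverse mapping on the way out. The sketch below shows this idea in Python/NumPy for illustration; the weights, biases, and min/max ranges are made-up values, not the ones from the trained net above.

```python
import numpy as np

def mapminmax_apply(x, xmin, xmax):
    """Forward mapminmax: linearly map [xmin, xmax] -> [-1, 1] per variable."""
    return 2 * (x - xmin) / (xmax - xmin) - 1

def mapminmax_reverse(y, ymin, ymax):
    """Reverse mapminmax: map [-1, 1] back to the original [ymin, ymax]."""
    return (y + 1) * (ymax - ymin) / 2 + ymin

def poslin(z):
    """ReLU, matching MATLAB's poslin transfer function."""
    return np.maximum(z, 0)

# Hypothetical weights, biases, and training-data ranges (illustration only).
W1 = np.array([[0.5, -0.2], [0.1, 0.4]])
b1 = np.array([[0.1], [-0.3]])
W2 = np.array([[1.0, -0.5]])
b2 = np.array([[0.2]])
in_min, in_max = np.array([[0.0], [0.0]]), np.array([[10.0], [10.0]])
out_min, out_max = 0.0, 5.0

x = np.array([[3.0], [2.0]])                 # raw input (u1 = 3, u2 = 2)
xn = mapminmax_apply(x, in_min, in_max)      # normalize input to [-1, 1]
a1 = poslin(W1 @ xn + b1)                    # hidden layer (ReLU)
yn = W2 @ a1 + b2                            # linear output layer (normalized units)
y = mapminmax_reverse(yn, out_min, out_max)  # map back to original output units
print(float(y))                              # ≈ 3.05
```

With the normalization and denormalization steps wrapped around the `W1`/`W2` forward pass like this, the manual computation matches what the network object computes internally.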