Tags: numpy, matrix, neural-network, rowwise, mean-square-error

How to apply Mean Square Error row-wise in Python using NumPy without looping


I'm building a primitive neural network to emulate an AND gate. The loss function is MSE:

import numpy as np

def mse(predicted, desired):
    return np.square(np.subtract(predicted, desired)).mean()

Below are the predictions and the desired outputs (a.k.a. labels):

predicted = np.array(
    [[0.5000, 0.5000],   # 0 AND 0
     [0.4721, 0.5279],   # 0 AND 1
     [0.3049, 0.6951],   # 1 AND 0
     [0.3345, 0.6655]])  # 1 AND 1

desired = np.array(
    [[1, 0],   # False
     [1, 0],   # False
     [1, 0],   # False
     [0, 1]])  # True

Each row (in both matrices above) represents a single case. I want to keep all the cases held together like this, rather than splitting them into separate vectors. The catch is that I need to treat each row individually.

I'm trying to get the following result, but so far I haven't been able to:

returned output = 
    [0.2500,  # 1st CASE ERROR
     0.2786,  # 2nd CASE ERROR
     0.4831,  # 3rd CASE ERROR 
     0.1118]  # 4th CASE ERROR

I tried the following function...

np.apply_along_axis(mse, 1, predicted, desired)

but it didn't work, because "desired" is passed in as the whole matrix rather than one row at a time. So, is there any way to achieve this without changing the "mse" function implementation or using loops?
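
For reference, np.apply_along_axis slices only the array argument; extra arguments such as "desired" are forwarded unchanged. Assuming the definitions above, each call effectively becomes:

    # What apply_along_axis invokes per row (illustrative):
    mse(predicted[0], desired)  # one 2-element row vs. the whole (4, 2) matrix;
    mse(predicted[1], desired)  # broadcasting lets this run, but the per-row
                                # pairing of prediction and label is lost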


Solution

  • Because all your data is in nicely formed ndarrays, you can make NumPy do all the heavy lifting. In this case, you can convert your for loop into a reduction along one of the array dimensions (a quick verification appears after the snippets below).

    np.square(np.subtract(predicted, desired)).mean(1)
    

    or

    ((predicted-desired)**2).mean(1)
    

    which is more readable IMO.
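
    For completeness, a self-contained check using the numbers from the question (the expected output in the question differs only by rounding in the last digit):

    import numpy as np

    predicted = np.array([[0.5000, 0.5000],
                          [0.4721, 0.5279],
                          [0.3049, 0.6951],
                          [0.3345, 0.6655]])

    desired = np.array([[1, 0],
                        [1, 0],
                        [1, 0],
                        [0, 1]])

    # mean(axis=1) reduces across columns, leaving one MSE per row/case.
    per_case = ((predicted - desired) ** 2).mean(axis=1)
    print(per_case)  # [0.25       0.27867841 0.48316401 0.11189025]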