keras neural-network regression loss multiple-outputs

Regression loss function for multiple outputs in Keras


I'm using a deep learning approach to address a regression problem with multiple outputs (16 outputs); each output lies in [0,1] and they sum to 1. I am confused about which loss function is ideal for this problem. I have already tried mean squared error and mean absolute error, but the neural network always predicts the same value.

from keras import applications
from keras.models import Model
from keras.layers import Flatten, Dense, BatchNormalization, Activation, Dropout
from keras.optimizers import Adam

model = applications.VGG16(include_top=False, weights=None, input_shape=(256, 256, 3))

x = model.output
x = Flatten()(x)
x = Dense(1024)(x)
x = BatchNormalization()(x)
x = Activation("relu")(x)
x = Dropout(0.5)(x)
x = Dense(512)(x)
x = BatchNormalization()(x)
x = Activation("relu")(x)
x = Dropout(0.5)(x)

predictions = Dense(16, activation="sigmoid")(x)

model_final = Model(inputs=model.input, outputs=predictions)

model_final.compile(loss='mse', optimizer=Adam(lr=0.1), metrics=['mae'])

Solution

  • What you are describing sounds more like a classification task, since you want to obtain a probability distribution at the end. You should therefore use a softmax activation (for example) in the last layer and cross-entropy as the loss, as in the sketch below.
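A minimal sketch of what that change could look like, reusing the setup from the question (the single dense head here is a simplification of the original model, and the smaller learning rate is an assumption; 0.1 with Adam is usually too high). Because the targets already sum to 1, categorical cross-entropy can be used directly with these "soft" labels:

from keras import applications
from keras.models import Model
from keras.layers import Flatten, Dense
from keras.optimizers import Adam

base = applications.VGG16(include_top=False, weights=None, input_shape=(256, 256, 3))

x = Flatten()(base.output)
x = Dense(1024, activation="relu")(x)

# softmax makes the 16 outputs non-negative and sum to 1, i.e. a probability distribution
predictions = Dense(16, activation="softmax")(x)

model_final = Model(inputs=base.input, outputs=predictions)

# categorical cross-entropy compares the predicted distribution with the target
# distribution; it also works with soft targets that are not one-hot vectors
model_final.compile(loss="categorical_crossentropy",
                    optimizer=Adam(lr=1e-4),
                    metrics=["mae"])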