I have started studying ML using Keras and TensorFlow, and I wanted to train a neural network on a TSV file. However, my val_accuracy is always the same. The TSV file has 1000 features to use for prediction and 1 output that can only be 0 or 1. I've tried using different optimizers and losses, varying the number of Dense layers, units, and activation functions. Here is my model:
model = keras.Sequential([
    keras.Input((1000,)),
    normalize,
    keras.layers.Dense(1000, activation=keras.activations.relu),
    keras.layers.Dense(200, activation=keras.activations.relu),
    keras.layers.Dense(50, activation=keras.activations.relu),
    keras.layers.Dense(1),
])
model.compile(optimizer='adam', loss=keras.losses.BinaryCrossentropy(from_logits=True), metrics=['accuracy'])
losses = model.fit(train_data, train_labels, epochs=80, validation_split=0.2)
After model.fit I get this result:
Epoch 77/80
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - accuracy: 0.3374 - loss: nan - val_accuracy: 0.1786 - val_loss: nan
Epoch 78/80
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 24ms/step - accuracy: 0.3603 - loss: nan - val_accuracy: 0.1786 - val_loss: nan
Epoch 79/80
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 25ms/step - accuracy: 0.3041 - loss: nan - val_accuracy: 0.1786 - val_loss: nan
Epoch 80/80
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 25ms/step - accuracy: 0.3645 - loss: nan - val_accuracy: 0.1786 - val_loss: nan
where val_accuracy is always 0.1786 and loss is nan. Maybe I don't understand how to use Dense layers. How can I fix this? For additional information, you can check my code on Google Colab with the TSV file.
The output from model.fit indicates that it cannot calculate the loss. If the loss remains nan, the training process cannot adjust the parameters, and consequently the model will never improve its val_accuracy.
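You can see the propagation directly: a single nan anywhere in the forward pass makes the loss nan, and the gradients of a nan loss are nan too, so every weight update after the first one corrupts the model. A minimal sketch (the values here are made up purely for illustration):

import numpy as np
import keras

# One nan logit is enough to make binary cross-entropy nan.
y_true = np.array([[1.0]])
nan_logits = np.array([[np.nan]])
loss_fn = keras.losses.BinaryCrossentropy(from_logits=True)
print(float(loss_fn(y_true, nan_logits)))  # nan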
I noticed in your source code that you are replacing "?" with None:
train_data = np.array(data.map(lambda x: None if x == '?' else int(x)))
With these None values, the model cannot compute a numeric output: once the data is converted to floats, each None becomes nan, and nan propagates through every layer and into the loss. I recommend filtering out the rows that contain missing values before training, for example as sketched below.
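A minimal sketch of that filtering, assuming `data` and `train_labels` are the same objects as in your notebook (mapping '?' to np.nan instead of None keeps the array numeric, which makes the missing entries easy to detect):

import numpy as np

# Map '?' to np.nan instead of None so the result is a plain float array.
train_data = np.array(data.map(lambda x: np.nan if x == '?' else int(x)),
                      dtype=float)

# Keep only the rows with no missing values, and drop the matching labels.
mask = ~np.isnan(train_data).any(axis=1)
train_data = train_data[mask]
train_labels = np.array(train_labels)[mask]

print(f"kept {mask.sum()} of {mask.size} rows")

If dropping rows costs too many samples (4/4 steps per epoch suggests your training set is already quite small), imputing the missing entries with the column mean is a common alternative.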