I'm trying to calculate the F1 score in a tf.Estimator setup. I've seen this SO question, but couldn't distill a working solution from it. The thing with tf.Estimator is that it expects me to deliver a value and an update op, so right now I have this piece of code at the end of my model:
if mode == tf.estimator.ModeKeys.EVAL:
    with tf.variable_scope('eval'):
        precision, precision_update_op = tf.metrics.precision(
            labels=labels,
            predictions=predictions['class'],
            name='precision')
        recall, recall_update_op = tf.metrics.recall(
            labels=labels,
            predictions=predictions['class'],
            name='recall')
        f1_score, f1_update_op = tf.metrics.mean(
            (2 * precision * recall) / (precision + recall),
            name='f1_score')
        eval_metric_ops = {
            "precision": (precision, precision_update_op),
            "recall": (recall, recall_update_op),
            "f1_score": (f1_score, f1_update_op)}
Now the precision and recall seem to be working just fine, but for the F1 score I keep getting nan.
How should I go about getting this to work?
EDIT: A working solution can be achieved with tf.contrib.metrics.f1_score, but since contrib is going to be deprecated in TF 2.0, I'd appreciate a contrib-less solution.
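For reference, a rough sketch of how tf.contrib.metrics.f1_score could slot into eval_metric_ops (it expects probability scores in [0, 1] rather than class IDs, so the predictions['probabilities'] key below is an assumption, not part of my original code):

if mode == tf.estimator.ModeKeys.EVAL:
    with tf.variable_scope('eval'):
        # f1_score maintains its own streaming variables, so there is no need
        # to combine precision and recall by hand.
        f1, f1_update_op = tf.contrib.metrics.f1_score(
            labels=labels,
            predictions=predictions['probabilities'],  # assumed key holding positive-class scores
            num_thresholds=200,
            name='f1_score')

        eval_metric_ops = {"f1_score": (f1, f1_update_op)}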
TensorFlow Addons already has an official solution, tfa.metrics.F1Score:
https://www.tensorflow.org/addons/api_docs/python/tfa/metrics/F1Score
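A minimal usage sketch (assuming a binary problem with one-hot labels and per-class scores, which is the shape the Keras-style metric expects; the example values are made up):

import tensorflow as tf
import tensorflow_addons as tfa

# tfa.metrics.F1Score is a Keras metric: labels and predictions are passed
# as [batch_size, num_classes] tensors.
f1 = tfa.metrics.F1Score(num_classes=2, average='micro', threshold=0.5)

y_true = tf.constant([[0., 1.], [1., 0.], [0., 1.]])
y_pred = tf.constant([[0.2, 0.8], [0.6, 0.4], [0.7, 0.3]])

f1.update_state(y_true, y_pred)
print(f1.result().numpy())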