Tags: scikit-learn, precision, precision-recall

Is it possible for both recall and precision to be zero?


I am trying to evaluate my model's performance, but for one class I get zero for both precision and recall (the data is imbalanced, with more than 20 classes).

So, is it possible for both recall and precision to be zero on test data?


Solution

  • Sure, why not. Precision is TP / (TP + FP) and recall is TP / (TP + FN), so both are zero exactly when a class has no true positives: the model never predicted that class correctly. Consider this simplified example:

    from sklearn.metrics import confusion_matrix, classification_report
    
    y_true = [0,0,0,1,0,0,0,1,0,1]
    y_pred = [0,0,0,0,1,0,0,0,1,0]
    
    confusion_matrix(y_true, y_pred)
    
    >>> array([[5, 2],
    >>>        [3, 0]])
    
    print(classification_report(y_true, y_pred))
    
    >>>              precision    recall  f1-score   support
    
    >>>        0       0.62      0.71      0.67         7
    >>>        1       0.00      0.00      0.00         3
    
    >>> accuracy                           0.50        10
    >>> macro avg      0.31      0.36      0.33        10
    >>> weighted avg   0.44      0.50      0.47        10
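    To confirm the report's numbers, here is a small sketch computing the per-class metrics directly with `precision_score` and `recall_score`. Note that both metrics are 0 whenever TP = 0; they are only *undefined* (raising an `UndefinedMetricWarning`) when the denominator is also 0, which the `zero_division` parameter (available since scikit-learn 0.22) controls:

    ```python
    from sklearn.metrics import precision_score, recall_score

    y_true = [0,0,0,1,0,0,0,1,0,1]
    y_pred = [0,0,0,0,1,0,0,0,1,0]

    # average=None returns one score per class.
    # Class 1: TP = 0, FP = 2, FN = 3, so precision = 0/2 and recall = 0/3.
    prec = precision_score(y_true, y_pred, average=None, zero_division=0)
    rec = recall_score(y_true, y_pred, average=None, zero_division=0)

    print(prec)  # [0.625 0.   ]
    print(rec)   # class 0: 5/7 ≈ 0.714, class 1: 0.0
    ```

    Whether this is a problem in practice depends on the class: with a heavily imbalanced dataset, a rare class can easily end up with zero true positives on the test split.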