I work with the Azure Machine Learning service for modeling. To track and analyze the results of a binary classification problem, I use a method named score_classification from the azureml.training.tabular.score.scoring library. I invoke the method like this:
metrics = score_classification(
    y_test, y_pred_probs, metrics_names_list, class_labels,
    train_labels, sample_weight=sample_weights, use_binary=True
)
The input arguments are the true test labels, the predicted probabilities, the list of metric names to compute, the class labels, the training labels, an optional sample_weight array, and the use_binary flag.
When the metrics I passed in metrics_names_list are calculated, the results are shown on the Metrics page of the Azure ML portal.
The confusion matrix is one of the metrics I plot each time. It has a combo box for the representation: Raw shows the number of items in each cell, and Normalized shows each cell as a percentage.
The problem is that I see float values instead of integer ones in the Raw view of this matrix. How can I handle this issue?
If you are using sample weights, each confusion matrix cell is calculated as the sum of the sample weights of the samples that fall into that cell, which can produce float values. If you want to see integer counts in the confusion matrix, try not passing sample weights to score_classification:
# Same call without sample_weight: each cell is a plain sample count, so integers.
classification_metrics = list(constants.CLASSIFICATION_SCALAR_SET)
scores = scoring.score_classification(
    y_test_df.values, predicted, classification_metrics, class_labels, train_labels
)