machine-learning, xgboost, early-stopping

Will XGBoost early stopping stop after marginal improvements?


I know that early stopping kicks in if there is no improvement (or a drop in performance) over the last X rounds; i.e. to continue training, we need at least one round among the last X with some improvement.

But in this Kaggle script, https://www.kaggle.com/vincentf/early-stopping-for-xgboost-python, I read the following in-code comment:

stops 50 iterations after marginal improvements or drop in performance on your hold out set
  1. So, it seems that "marginal improvements" means essentially no improvement, is that right?
  2. If so, what counts as a marginal improvement? Can we set its value?

Solution

  • The meaning of "marginal improvements" in the post you have linked to is quite unclear (and arguably misleading); here is the relevant information from the documentation:

    early_stopping_rounds (int) – Activates early stopping. Validation metric needs to improve at least once in every early_stopping_rounds round(s) to continue training.

    which clearly does not support the "marginal improvements" claim: any improvement of the validation metric, however small, is enough to keep training going. A minimal usage sketch follows below.
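
For concreteness, here is a minimal sketch of how early_stopping_rounds is passed to xgb.train. The random dataset, parameter values, and the choice of 50 rounds are illustrative, not taken from the linked script; per the documentation quoted above, there is no separate threshold for what counts as "marginal".

    # Minimal sketch; dataset and hyperparameters are illustrative only.
    import numpy as np
    import xgboost as xgb

    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 10))
    y = rng.integers(0, 2, size=1000)

    dtrain = xgb.DMatrix(X[:800], label=y[:800])
    dvalid = xgb.DMatrix(X[800:], label=y[800:])

    params = {"objective": "binary:logistic", "eval_metric": "logloss"}
    bst = xgb.train(
        params,
        dtrain,
        num_boost_round=1000,
        evals=[(dvalid, "validation")],
        # Training stops only if the validation logloss fails to improve at all
        # for 50 consecutive rounds; any improvement resets the counter.
        early_stopping_rounds=50,
    )
    print(bst.best_iteration)  # round with the best validation score

So the "50 iterations after marginal improvements" wording in the linked comment is best read simply as "50 consecutive rounds with no improvement at all".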