As the title says, I would like to know whether it is possible to make a model stop training early, once the error has dropped far enough, so that I can avoid overfitting and having to guess the right number of epochs on every call.
This is the only thing I have found in the official documentation, but it is meant to be used with BrainScript, which I know nothing about. I am using Python 3.6 with CNTK 2.6.
Also, is there a way to perform cross-validation with a CNTK CNN? How could this be done?
Thanks in advance.
The CrossValidationConfig class tells CNTK to periodically evaluate the model on a validation data set and then call a user-specified callback function, which can then update the learning rate, or return False to signal early stopping.
For examples of how to implement early stopping:
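As a minimal sketch, the callback could track the best validation error seen so far and stop after a few evaluations without improvement. The patience logic below is my own illustrative choice, not part of CNTK's API; only the callback's signature (index, average error, sample count, minibatch count) and its boolean return value follow the CrossValidationConfig contract.

```python
# Early-stopping callback in the shape CrossValidationConfig expects.
# Returning False tells CNTK to stop training; True means continue.

best_error = float("inf")
checks_without_improvement = 0
patience = 3  # assumed tolerance: stop after 3 evaluations with no improvement

def cv_callback(index, average_error, cv_num_samples, cv_num_minibatches):
    """Invoked by the training session after each cross-validation pass."""
    global best_error, checks_without_improvement
    if average_error < best_error:
        # Validation error improved: reset the patience counter.
        best_error = average_error
        checks_without_improvement = 0
    else:
        checks_without_improvement += 1
    return checks_without_improvement < patience
```

You would then pass this function as the `callback` argument of `CrossValidationConfig`, together with a minibatch source for your validation data, and hand the config to the training session that runs your trainer.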