Is there a way to access the current value chosen by hyperopt for a parameter? I would like to use the selected value in a learning rate callback function for xgboost.
from hyperopt import hp
param = {'eta' : hp.uniform('eta', 0.01, 0.1)} # learning rate
param['eta'] # returns <hyperopt.pyll.base.Apply at 0x23fd5699dd8>
You'll get a concrete value for 'eta' on each call of your objective function when you run fmin.
E.g.
from hyperopt import fmin, tpe
_ = fmin(fn=objective,
         space=param,
         algo=tpe.suggest,  # fmin requires a search algorithm
         max_evals=num_trials)
And objective is defined as:
from typing import Dict
def objective(params: Dict):
    # params['eta'] is a concrete float here, sampled by hyperopt for this trial
    ...  # train your model and return the loss to minimize
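For the original use case of a learning rate callback in xgboost, here is a minimal end-to-end sketch. It assumes xgboost >= 1.3 (for xgb.callback.LearningRateScheduler); the synthetic data, the 'reg:squarederror' objective, the 1% per-round decay, and the num_boost_round / max_evals values are illustrative placeholders, not from the original post.

import numpy as np
import xgboost as xgb
from typing import Dict
from hyperopt import fmin, hp, tpe

# Placeholder data so the sketch runs end to end; substitute your own.
rng = np.random.default_rng(0)
X_train, y_train = rng.normal(size=(200, 5)), rng.normal(size=200)
X_val, y_val = rng.normal(size=(50, 5)), rng.normal(size=50)

param_space = {'eta': hp.uniform('eta', 0.01, 0.1)}

def objective(params: Dict) -> float:
    dtrain = xgb.DMatrix(X_train, label=y_train)
    dval = xgb.DMatrix(X_val, label=y_val)

    base_eta = params['eta']  # concrete float sampled by hyperopt for this trial

    # Illustrative schedule: start at the sampled eta and decay it 1% per boosting round.
    lr_schedule = xgb.callback.LearningRateScheduler(
        lambda round_idx: base_eta * (0.99 ** round_idx))

    booster = xgb.train(
        {'objective': 'reg:squarederror', 'eta': base_eta},
        dtrain,
        num_boost_round=100,
        evals=[(dval, 'val')],
        callbacks=[lr_schedule],
        verbose_eval=False)

    # Report the final validation RMSE as the loss hyperopt minimizes.
    return float(booster.eval(dval).split(':')[-1])

best = fmin(fn=objective, space=param_space, algo=tpe.suggest, max_evals=20)
print(best)  # e.g. {'eta': 0.03...}

Passing base_eta both as 'eta' and as the scheduler's starting point keeps the two consistent; if a constant learning rate per trial is enough, you can drop the callback and just keep 'eta' in the params dict.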