Tags: scikit-learn, bayesian, hyperparameters, scikit-optimize

BayesSearchCV parameters


I just read about Bayesian optimization and I want to try it.

I installed scikit-optimize and checked the API, and I'm confused:

  1. I read that Bayesian optimization starts with some initial samples.

    • I can't see where I can change this number in BayesSearchCV?
    • n_points changes the number of parameter settings sampled in parallel, and n_iter is the number of iterations (and, if I'm not wrong, the iterations can't run in parallel: the algorithm improves the parameters after every iteration).
  2. I read that we can use different acquisition functions, but I can't see where I can change the acquisition function in BayesSearchCV?


Solution

  • Is this something you are looking for?

    BayesSearchCV(..., optimizer_kwargs={'n_initial_points': 20, 'acq_func': 'gp_hedge'}, ...)
    

    skopt.Optimizer is the one actually doing the hyperparameter optimization.

    BayesSearchCV will build the Optimizer with the optimizer_kwargs parameters.

    https://github.com/scikit-optimize/scikit-optimize/blob/de32b5fd2205a1e58526f3cacd0422a26d315d0f/skopt/searchcv.py#L551
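
    For reference, here's a minimal runnable sketch (the estimator, search space, and concrete values are only illustrative) showing where the number of initial points and the acquisition function are set:

    from sklearn.datasets import load_iris
    from sklearn.svm import SVC
    from skopt import BayesSearchCV
    from skopt.space import Real, Categorical

    X, y = load_iris(return_X_y=True)

    opt = BayesSearchCV(
        SVC(),
        {
            'C': Real(1e-3, 1e3, prior='log-uniform'),
            'gamma': Real(1e-4, 1e1, prior='log-uniform'),
            'kernel': Categorical(['rbf', 'poly']),
        },
        n_iter=32,    # total number of parameter settings sampled
        n_points=1,   # settings evaluated in parallel per iteration
        cv=3,
        optimizer_kwargs={
            'n_initial_points': 20,  # random samples before the surrogate model is used
            'acq_func': 'EI',        # e.g. expected improvement instead of the default 'gp_hedge'
        },
        random_state=0,
    )
    opt.fit(X, y)
    print(opt.best_params_, opt.best_score_)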