I'm using Hydra to manage my model's parameter configuration, and I want to do hyperparameter tuning. I'm using two different optimizers, Adam and SGD, and they have different parameters, so I wanted to use a nested sweep to set them; but I'm not sure whether Hydra's Basic Sweeper supports nested sweeps and, if so, how to set one up. The documentation doesn't really say anything about it, but maybe I'm missing something.

My primary config:
defaults:
  - dataset: file_cpet
  - optimizer: ???
  - model: tenet_lstm
  - _self_

working_dir: ${hydra:runtime.cwd}
data_dir: ${working_dir}/data/
batch_size: 10
epochs: 150

hydra:
  sweeper:
    params:
      optimizer: sgd, adam
      optimizer.lr: 1e-2, 1.5e-2
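For context, I launch the sweep with Hydra's multirun flag, i.e. python src/scripts/train_lstm.py --multirun (the script path is taken from the traceback below).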
sgd.yaml
defaults:
  - sgd-tuning: momentum-tuning
  - _self_

name: sgd
lr: 1.0e-2
nesterov: False
weight_decay: 0

hydra:
  sweeper:
    params:
      sgd-tuning.momentum: 0.85, 0.9
momentum-tuning.yaml
momentum: 0.85
I tried setting it up like this, but it doesn't work; or rather, it performs the tuning when it's Adam's turn, but not when it's SGD's turn, and I get this error:
Error executing job with overrides: ['optimizer=sgd', 'optimizer.lr=0.01']
Traceback (most recent call last):
  File "c:\Users\Parlu\Desktop\respirazione\src\scripts\train_lstm.py", line 141, in main
    m = cfg.optimizer.momentum
omegaconf.errors.ConfigAttributeError: Key 'momentum' is not in struct
    full_key: optimizer.momentum
    object_type=dict
This is not supported. The Basic Sweeper expands hydra.sweeper.params into a flat cross-product of job overrides before any job runs, so it cannot condition one parameter's sweep on the value of another; the hydra.sweeper.params block inside sgd.yaml is effectively ignored. You can either run one sweep per optimizer or create a top-level script that launches both sweeps.
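A minimal sketch of the second option, assuming you move momentum directly into sgd.yaml (so it is addressable as optimizer.momentum) and that the training script lives at src/scripts/train_lstm.py as in your traceback:

import subprocess

# One Basic Sweeper multirun per optimizer, each with its own parameter grid.
# Script path and parameter values are illustrative; adjust to your project.
SWEEPS = [
    ["optimizer=adam", "optimizer.lr=1e-2,1.5e-2"],
    ["optimizer=sgd", "optimizer.lr=1e-2,1.5e-2", "optimizer.momentum=0.85,0.9"],
]

for overrides in SWEEPS:
    # --multirun tells Hydra to sweep over the comma-separated values.
    subprocess.run(
        ["python", "src/scripts/train_lstm.py", "--multirun", *overrides],
        check=True,
    )

Each invocation runs a full grid for one optimizer, so the momentum values are only swept when SGD is selected and Adam never sees a momentum key.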