deep-learning julia flux

How to adjust the learning rate of the optimiser in Flux Julia


I'm trying to implement learning rate decay for my neural network. I set up the model and optimiser state as follows:

using Flux

nn = Chain(Dense(10, 5), Dense(5, 1))
opt = Adam(0.01)                  # optimiser rule with eta = 0.01
opt_state = Flux.setup(opt, nn)   # per-parameter optimiser state tree

I tried to adjust the learning rate directly in opt_state, but found that the Optimisers.Adam rule stored there is an immutable struct, so the assignment fails. Changing eta on opt itself raises no error:

opt_state.layers[1][:weight].rule.eta = 0.001 # ERROR: setfield!: immutable struct of type Adam cannot be changed
opt.eta = 0.001 # no error

Besides, reaching into opt_state like that doesn't look great anyway. Is it possible to modify the learning rate of the optimiser without setting up a new opt_state?


Solution

  • You're looking for Optimisers.adjust!(opt_state, 0.001), described here: https://fluxml.ai/Optimisers.jl/dev/#Adjusting-Hyperparameters. A sketch follows below.
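
    Here is a minimal sketch of how that could look with the model from the question, assuming Flux ≥ 0.14 (where Flux.setup returns an Optimisers.jl state tree) and a recent Optimisers.jl that provides adjust!. The dummy data, loss, and decay schedule are made up purely for illustration:

    using Flux
    import Optimisers

    nn = Chain(Dense(10, 5), Dense(5, 1))
    opt_state = Flux.setup(Adam(0.01), nn)

    x, y = rand(Float32, 10, 32), rand(Float32, 1, 32)   # dummy batch

    for epoch in 1:20
        grads = Flux.gradient(m -> Flux.mse(m(x), y), nn)
        Flux.update!(opt_state, nn, grads[1])
        if epoch % 5 == 0
            # halve the learning rate every 5 epochs, no new setup needed
            Optimisers.adjust!(opt_state, 0.01 * 0.5^(epoch ÷ 5))
        end
    end

    Per the same docs page, adjust! also accepts keyword arguments for other hyperparameters (e.g. Optimisers.adjust!(opt_state; beta = (0.8, 0.99))), and it should work on a sub-tree such as opt_state.layers[1] if you only want to change one layer.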