I get the following error when running this code:
hparams = HyperParams()
gen = generator()
enc = encoder()
dec = decoder()
gen_opt = Flux.setup(Flux.Adam(hparams.lr_gen), gen)
enc_opt = Flux.setup(Flux.Adam(hparams.lr_enc), enc)
dec_opt = Flux.setup(Flux.Adam(hparams.lr_dec), dec)
losses_gen = []
losses_dscr = []
train_steps = 0
# Training loop
gen_ps = Flux.params(gen)
enc_ps = Flux.params(enc)
dec_ps = Flux.params(dec)
...
WARNING: both Losses and NNlib export "ctc_loss"; uses of it in module Flux must be qualified
ERROR: UndefVarError: `setup` not defined
Stacktrace:
 [1] getproperty(x::Module, f::Symbol)
@ Base ./Base.jl:31
[2] top-level scope
@ ~/github/AdaptativeBlockLearning/examples/MMD_GAN/mmd_gan_1d.jl:65
I am using Julia v1.9.0 and Flux v0.12.10. Can someone tell me what the new way of defining the optimizer is, and in which version of Flux it changed?
`Flux.setup` does not exist in Flux v0.12.10; it was introduced in the v0.13 series (v0.13.9) as part of the move from implicit parameters (`Flux.params`) to explicit optimiser state via Optimisers.jl. Updating Flux to v0.13.17 (or any release ≥ v0.13.9) makes `Flux.setup` available. Note that with the explicit style you also take gradients with respect to the model itself and call `Flux.update!`, rather than collecting `Flux.params`.
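As a minimal sketch of the explicit-style training step after upgrading (the `Chain` here is a hypothetical stand-in for your `generator`; layer sizes and loss are illustrative only):

```julia
using Flux

# Hypothetical small model standing in for gen = generator()
gen = Chain(Dense(1 => 16, relu), Dense(16 => 1))

# Explicit optimiser state (available since Flux v0.13.9)
gen_opt = Flux.setup(Adam(1e-3), gen)

# Dummy batch, just to show the shapes involved
x = rand(Float32, 1, 8)
y = rand(Float32, 1, 8)

# One training step: gradient w.r.t. the model, then update in place.
# No Flux.params needed in the explicit style.
grads = Flux.gradient(m -> Flux.mse(m(x), y), gen)
Flux.update!(gen_opt, gen, grads[1])
```

The same pattern applies to your `enc`/`dec` models: one `Flux.setup` state per model, and one `Flux.update!` call per gradient step.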