tensorflow, deep-learning

Does it make sense to use a dynamic learning rate with AdamOptimizer?


I'm developing a convolutional neural network for image recognition with three classes of my own. I built an AlexNet-based model to train it. I'd like to know two things:

  1. Does AdamOptimizer perform learning rate decay internally (starting from the fixed value you give it), or not?
  2. If not, can I use tf.train.exponential_decay to perform the decay?

Small examples are appreciated. Thanks.


Solution

  • As you can see in adam.py, AdamOptimizer adjusts its effective learning rate internally.

    The learning rate you pass to the constructor only sets the initial value it starts from.

    So yes, it does not make much sense to use exponential decay with AdamOptimizer; it does make sense with gradient descent or a momentum optimizer. See here for an example, and the sketch below.
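
    A minimal sketch of both setups, assuming TensorFlow 1.x and its tf.train.* API; the toy loss and every hyperparameter value here are placeholders for illustration, not taken from your AlexNet model:

        import tensorflow as tf  # assumes TensorFlow 1.x (tf.train.* optimizers)

        # Toy loss standing in for your model's real training loss.
        w = tf.Variable(5.0)
        loss = tf.square(w)

        # Adam: the constructor value is only the initial/base step size;
        # Adam adapts the effective per-parameter step from its moment estimates.
        adam_train_op = tf.train.AdamOptimizer(learning_rate=1e-4).minimize(loss)

        # Gradient descent / momentum: an explicit schedule does make sense here.
        global_step = tf.Variable(0, trainable=False)
        decayed_lr = tf.train.exponential_decay(
            learning_rate=0.01,      # starting rate (placeholder value)
            global_step=global_step,
            decay_steps=1000,        # decay every 1000 steps (placeholder value)
            decay_rate=0.96,         # multiply the rate by 0.96 at each decay
            staircase=True)
        momentum_train_op = tf.train.MomentumOptimizer(decayed_lr, momentum=0.9) \
            .minimize(loss, global_step=global_step)  # minimize() increments global_step

    Running momentum_train_op in a session advances global_step, so decayed_lr shrinks automatically as training progresses, while the Adam op keeps its single fixed base rate.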