Tags: r, neural-network, keras-rl

How to control the learning rate in kerasR in R


To fit a classification model in R, I have been using library(kerasR). To control the learning rate, kerasR says to use:

compile(optimizer = Adam(lr = 0.001, beta_1 = 0.9, beta_2 = 0.999, epsilon = 1e-08, decay = 0, clipnorm = -1, clipvalue = -1),
        loss = 'binary_crossentropy',
        metrics = c('categorical_accuracy'))

But it gives me an error like this:

Error in modules$keras.optimizers$Adam(lr = lr, beta_1 = beta_2, beta_2 = beta_2, : attempt to apply non-function

I also tried keras_compile but still get the same error. I can change the optimizer in compile, but the largest learning rate I can get that way is 0.01, and I want to try 0.2.

model <- keras_model_sequential()

model %>%
  layer_dense(units = 512, activation = 'relu', input_shape = ncol(X_train)) %>%
  layer_dropout(rate = 0.2) %>%
  layer_dense(units = 128, activation = 'relu') %>%
  layer_dropout(rate = 0.1) %>%
  layer_dense(units = 2, activation = 'sigmoid') %>%
  compile(
    optimizer = 'Adam',
    loss = 'binary_crossentropy',
    metrics = c('categorical_accuracy')
  )

Solution

  • I think the issue is that you are mixing two different libraries, kerasR and keras. You should use only one of them. You are using the keras_model_sequential function, which is from keras, and then you try to use the Adam function, which is from the kerasR library. You can find the differences between these two libraries here: https://www.datacamp.com/community/tutorials/keras-r-deep-learning#differences (If you want to stay with kerasR instead, see the sketch after the keras example below.)

    The following code works for me, using only the keras library.

    library(keras)
    model <- keras_model_sequential()

    model %>%
      layer_dense(units = 512, activation = 'relu', input_shape = ncol(X_train)) %>%
      layer_dropout(rate = 0.2) %>%
      layer_dense(units = 128, activation = 'relu') %>%
      layer_dropout(rate = 0.1) %>%
      layer_dense(units = 2, activation = 'sigmoid') %>%
      compile(
        optimizer = optimizer_adam(lr = 0.2),
        loss = 'binary_crossentropy',
        metrics = c('accuracy')
      )
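
    Alternatively, if you prefer to keep using kerasR on its own, the model has to be built entirely with kerasR's API: the model is created with Sequential() and $add(), and compiled with keras_compile rather than compile, so Adam here is kerasR's Adam and accepts lr = 0.2 directly. A minimal sketch, assuming X_train and Y_train are defined as in the question:

    library(kerasR)

    # build the same architecture through kerasR's wrappers
    mod <- Sequential()
    mod$add(Dense(units = 512, input_shape = ncol(X_train)))
    mod$add(Activation("relu"))
    mod$add(Dropout(0.2))
    mod$add(Dense(units = 128))
    mod$add(Activation("relu"))
    mod$add(Dropout(0.1))
    mod$add(Dense(units = 2))
    mod$add(Activation("sigmoid"))

    # keras_compile is kerasR's compile step; lr = 0.2 is passed to kerasR's Adam
    keras_compile(mod, loss = 'binary_crossentropy', optimizer = Adam(lr = 0.2))
    keras_fit(mod, X_train, Y_train, batch_size = 32, epochs = 10)

    Note that in newer releases of the keras R package the lr argument of optimizer_adam() has been renamed to learning_rate, so depending on your installed version you may need optimizer_adam(learning_rate = 0.2) instead.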