I noticed that assigning a keras initializer inside a layer results in an error claiming that Tensor-typed variable initializers must either be wrapped in an init_scope or be callable. However, I fail to see how my use case below differs from the example provided here. Is there a workaround for this issue, or am I making some obvious error in using Keras initializers?
Here is the minimal example that I could come up with:
import tensorflow as tf
from keras.models import *
from keras.layers import *
from keras.optimizers import *
from tensorflow.keras import initializers
inputs_test = Input((512, 512, 3))
initializer_truncated_norm = initializers.TruncatedNormal(mean=0, stddev=0.02)
deconv_filter = Conv2DTranspose(3, (2, 2), strides=(2, 2), padding='same', kernel_initializer=initializer_truncated_norm)(inputs_test)
model2 = Model(inputs=inputs_test, outputs=deconv_filter)
optimizer = Adam(lr=1e-4)
model2.compile(optimizer=optimizer, loss='mse')
model2.summary()
Here is the exact error that I get when running this code:
ValueError: Tensor-typed variable initializers must either be wrapped in an init_scope or callable (e.g.,
tf.Variable(lambda : tf.truncated_normal([10, 40]))
) when building functions. Please file a feature request if this restriction inconveniences you.
I ran your example without errors with tf==2.3.1 and keras==2.4.0. I also tried it with tf==2.0 in Colab, and it worked fine there as well. I suggest you upgrade to the latest TF and try again.
Also, change your imports from from keras to from tensorflow.keras, as in the sketch below.
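For reference, here is a minimal sketch of your snippet with every import unified under tensorflow.keras (assuming TF 2.x; note that learning_rate replaces the deprecated lr argument in tf.keras optimizers):
# Sketch: same model, all imports from tensorflow.keras (assumes TF 2.x)
from tensorflow.keras.models import Model
from tensorflow.keras.layers import Input, Conv2DTranspose
from tensorflow.keras.optimizers import Adam
from tensorflow.keras import initializers

inputs_test = Input((512, 512, 3))
# The initializer object is passed directly; tf.keras accepts it as-is
initializer_truncated_norm = initializers.TruncatedNormal(mean=0.0, stddev=0.02)
deconv_filter = Conv2DTranspose(
    3, (2, 2), strides=(2, 2), padding='same',
    kernel_initializer=initializer_truncated_norm)(inputs_test)

model2 = Model(inputs=inputs_test, outputs=deconv_filter)
model2.compile(optimizer=Adam(learning_rate=1e-4), loss='mse')
model2.summary()
Mixing the standalone keras package with tensorflow.keras objects in one model is a common source of exactly this kind of initializer/variable error, since the two libraries use different variable-creation paths.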
If it still fails, post the full error stack and version info here.