I am using TensorFlow 1.12 in eager execution, and I want to inspect the values of my gradients and my weights at different points during training for debugging purposes. This answer uses TensorBoard to get nice graphs of the weight and gradient distributions over epochs, which is what I would like. However, when I use Keras' TensorBoard callback, I get this:
WARNING:tensorflow:Weight and gradient histograms not supported for eager execution, setting `histogram_freq` to `0`.
In other words, this is not compatible with eager execution. Is there any other way to print gradients and/or weights? Most non-TensorBoard answers seem to rely on graph-based execution.
In eager execution, you can print the weights directly. For the gradients, use tf.GradientTape to compute the gradient of the loss with respect to the layer's weights; trainable variables accessed inside the tape are watched automatically. Here is an example showing how to print both gradients and weights:
import tensorflow as tf

tf.enable_eager_execution()

# Toy inputs and targets
x = tf.ones(shape=(4, 3))
y = tf.ones(shape=(4, 1))

# A single dense layer with one output unit
dense = tf.layers.Dense(1)

# Print gradients: run the forward pass and the loss inside the tape
with tf.GradientTape() as t:
    h = dense(x)
    loss = tf.losses.mean_squared_error(y, h)

# Gradient of the loss with respect to the layer's kernel
gradients = t.gradient(loss, dense.kernel)
print('Gradients: ', gradients)

# Print weights (kernel and bias as NumPy arrays)
weights = dense.get_weights()
print('Weights: ', weights)
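If you also want to see how these values evolve over training (and get histograms into TensorBoard despite the callback limitation), the same pattern extends to a plain eager training loop. The sketch below is a minimal, self-contained example under a few assumptions: it reuses the toy `x`, `y`, and `dense` from above, the log directory `'logs'`, the `GradientDescentOptimizer`, and the 0.1 learning rate are arbitrary choices, and histograms are written with the TF 1.x `tf.contrib.summary` API, which works in eager mode.

import tensorflow as tf

tf.enable_eager_execution()

x = tf.ones(shape=(4, 3))
y = tf.ones(shape=(4, 1))

dense = tf.layers.Dense(1)
optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.1)  # arbitrary choice
global_step = tf.train.get_or_create_global_step()

# 'logs' is just an example directory
writer = tf.contrib.summary.create_file_writer('logs')

with writer.as_default(), tf.contrib.summary.always_record_summaries():
    for epoch in range(5):
        with tf.GradientTape() as t:
            h = dense(x)
            loss = tf.losses.mean_squared_error(y, h)

        # Gradients w.r.t. every trainable variable (kernel and bias)
        variables = dense.trainable_variables
        gradients = t.gradient(loss, variables)

        # Inspect the raw values at this point in training
        for var, grad in zip(variables, gradients):
            print(epoch, var.name, grad.numpy())

        # Log histograms that TensorBoard can plot over steps
        tf.contrib.summary.histogram('kernel', dense.kernel, step=global_step)
        tf.contrib.summary.histogram('kernel_gradient', gradients[0], step=global_step)

        # Apply the update so the next iteration sees new weights
        optimizer.apply_gradients(zip(gradients, variables),
                                  global_step=global_step)

tf.contrib.summary.flush()

Run `tensorboard --logdir logs` to view the histograms. The printed values are regular eager tensors, so `.numpy()` gives you NumPy arrays you can inspect however you like.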