I created my own neural network and I want to differentiate it automatically with respect to the input variable. My code for the network looks like this:
import tensorflow as tf
import numpy as np

n_input = 1
n_hidden_1 = 50
n_hidden_2 = 50
n_output = 1

weights = {
    'h1': tf.Variable(tf.random.normal([n_input, n_hidden_1], 0, 0.5)),
    'h2': tf.Variable(tf.random.normal([n_hidden_1, n_hidden_2], 0, 0.5)),
    'out': tf.Variable(tf.random.normal([n_hidden_2, n_output], 0, 0.5))
}
biases = {
    'b1': tf.Variable(tf.random.normal([n_hidden_1], 0, 0.5)),
    'b2': tf.Variable(tf.random.normal([n_hidden_2], 0, 0.5)),
    'out': tf.Variable(tf.random.normal([n_output], 0, 0.5))
}
def multilayer_perceptron(x):
    x = np.array([[[x]]], dtype='float32')
    layer_1 = tf.add(tf.matmul(x, weights['h1']), biases['b1'])
    layer_1 = tf.nn.tanh(layer_1)
    layer_2 = tf.add(tf.matmul(layer_1, weights['h2']), biases['b2'])
    layer_2 = tf.nn.tanh(layer_2)
    output = tf.matmul(layer_2, weights['out']) + biases['out']
    return output
Then, with tf.GradientTape(), I tried to differentiate the neural network like this:
x = tf.Variable(1.0)
with tf.GradientTape() as tape:
    y = multilayer_perceptron(x)
dNN1 = tape.gradient(y, x)
print(dNN1)
This prints None. What did I do wrong here?
For TensorFlow operations to be differentiable through the tape, every value in the computation needs to stay a tf.Tensor. Wrapping x in np.array inside multilayer_perceptron moves the data out of TensorFlow's graph, so the tape cannot trace y back to x and tape.gradient returns None. Reshape with TensorFlow instead of NumPy:
def multilayer_perceptron(x):
    x = tf.reshape(x, (1, 1, 1))  # stays a tf.Tensor, so the tape can trace it
    # ... rest of the function unchanged
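Putting it together, here is a minimal sketch of the corrected function and the gradient call, assuming TensorFlow 2.x eager execution and the same weights and biases dictionaries defined above:

def multilayer_perceptron(x):
    # reshape with tf.reshape so x stays inside TensorFlow's graph
    x = tf.reshape(x, (1, 1, 1))
    layer_1 = tf.nn.tanh(tf.add(tf.matmul(x, weights['h1']), biases['b1']))
    layer_2 = tf.nn.tanh(tf.add(tf.matmul(layer_1, weights['h2']), biases['b2']))
    return tf.matmul(layer_2, weights['out']) + biases['out']

x = tf.Variable(1.0)
with tf.GradientTape() as tape:
    y = multilayer_perceptron(x)
dNN1 = tape.gradient(y, x)
print(dNN1)  # now a scalar tf.Tensor instead of None

If x were a plain tf.constant rather than a tf.Variable, you would also need to call tape.watch(x) inside the with block, since the tape only records trainable variables automatically.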