Liu Sida's Homepage

Machine Learning & Human Learning

Use Tensorflow to Compute Gradient

In most TensorFlow tutorials, we call minimize(loss) to update the parameters of the model automatically.

In fact, minimize() combines two steps: computing the gradients, and applying the gradients to update the parameters (in TensorFlow's API, compute_gradients() and apply_gradients()).
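To make the two-step idea concrete, here is a minimal sketch in plain Python (no TensorFlow): the function names `compute_gradients` and `apply_gradients` mirror TensorFlow's, but the gradient itself is hand-coded for the example loss $Y = (100 - WX - B)^2$ used below.

```python
# A plain-Python sketch of what minimize() does under the hood:
# step 1 computes gradients, step 2 applies them with a learning rate.
def compute_gradients(W, B, X=3.0):
    N = 100 - W * X - B
    # dY/dW = 2N * (-X), dY/dB = 2N * (-1)
    return (-2 * X * N, -2 * N)

def apply_gradients(params, grads, lr=0.01):
    # Gradient descent: move each parameter against its gradient.
    return [p - lr * g for p, g in zip(params, grads)]

W, B = 1.0, 1.0
gW, gB = compute_gradients(W, B)   # -576.0, -192.0
W, B = apply_gradients([W, B], [gW, gB])
```

Note that the learning rate only enters in the second step; the gradients themselves do not depend on it.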

Let’s take a look at an example:

What are the gradients of Y with respect to W and B when X=3.0, W=1.0, B=1.0?

We can calculate them by hand:

Let $N = 100 - 3W - B$, so that $Y = N^2$. By the chain rule,

$\frac{\partial Y}{\partial W} = 2N \cdot (-3) = -6N$, and $\frac{\partial Y}{\partial B} = 2N \cdot (-1) = -2N$.

At $W = 1.0$, $B = 1.0$ we have $N = 96$, so $\partial Y / \partial W = -576$ and $\partial Y / \partial B = -192$.
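The hand derivation can be double-checked numerically with a central finite difference (plain Python; the function name `Y` here is our own, not part of TensorFlow):

```python
# Finite-difference check of the hand-derived gradients
# for Y = (100 - W*X - B)^2 at X=3, W=1, B=1.
def Y(W, B, X=3.0):
    return (100 - W * X - B) ** 2

eps = 1e-4
W0, B0 = 1.0, 1.0
dY_dW = (Y(W0 + eps, B0) - Y(W0 - eps, B0)) / (2 * eps)
dY_dB = (Y(W0, B0 + eps) - Y(W0, B0 - eps)) / (2 * eps)
print(dY_dW, dY_dB)  # close to -576 and -192
```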

OK, now let's use TensorFlow to compute the same thing:

import tensorflow as tf

# The example from above:
# Y = (100 - W*X - B)^2
X = tf.constant(3.)
W = tf.Variable(1.)
B = tf.Variable(1.)
Y = tf.square(100 - W*X - B)

# The learning rate passed here plays no role in computing the gradients;
# it only takes effect when the gradients are applied.
optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.1)
grads_and_vars = optimizer.compute_gradients(Y, var_list=[W, B])

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(grads_and_vars))

The output is a list of (gradient, variable value) pairs, one per variable:

[(-576.0, 1.0), (-192.0, 1.0)]

which matches the gradients we computed by hand.