Tensorflow: Which graph statements are executed after the graph is built?

In Tensorflow, which statements within a graph definition block are executed only to build the graph vs. which are executed during training? For example:

with tf.Graph().as_default():
    weightsLayer1 = tf.Variable(tf.truncated_normal([nInputUnits, nOutputUnits]))
    weightsLayer1 = tf.div(weightsLayer1, tf.sqrt(tf.to_float(nInputUnits)))
    biasesLayer1 = tf.Variable(tf.zeros([nOutputUnits]))
    layer1output = tf.tanh(tf.matmul(images_placeholder, weightsLayer1) + biasesLayer1)

Intuitively, I assume the lines defining weightsLayer1 and biasesLayer1 are only executed once at startup, since they initialize the weights and biases. However, I assume the line computing layer1output executes at every training step, since layer1output is used downstream to compute the loss, which the optimizer minimizes. So how does Tensorflow know, during training, to execute only the last line and not the previous ones (which would re-initialize the weights and biases)?


You, as the user, are actually telling tensorflow which operations to run. During training, you typically tell tensorflow to execute the operations provided by an optimizer. That looks something like this:

opt = tf.train.GradientDescentOptimizer(0.01)
train_step = opt.minimize(loss)  # adds gradient and variable-update ops to the graph
for i in range(100):
    sess.run(train_step, feed_dict=...)

Calling opt.minimize adds to the computation graph the gradients with respect to the trainable variables, as well as operations that update those variables using the gradients. train_step is in fact these update operations grouped using tf.group. If you (the user) run train_step, tensorflow figures out which parts of the computation graph it needs to run in order to execute these desired operations.
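If it helps, here is a rough sketch of what minimize does internally, expressed with the two lower-level optimizer calls it is built from (a simplification, not the library's exact implementation):

opt = tf.train.GradientDescentOptimizer(0.01)
grads_and_vars = opt.compute_gradients(loss)      # adds the gradient ops for each trainable variable
train_step = opt.apply_gradients(grads_and_vars)  # adds the ops that apply the updates

minimize(loss) is essentially these two calls chained together, and the returned train_step behaves like a tf.group of the individual variable updates.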

Likewise, if you do something like sess.run(fetches=loss, feed_dict=...), you are asking tensorflow to execute all operations in the graph that are necessary to compute loss.
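To make that concrete, here is a small self-contained toy example (the names v, bump and total are made up for illustration): fetching total never runs the unrelated assign op.

import tensorflow as tf

with tf.Graph().as_default():
    v = tf.Variable(0.0)
    bump = tf.assign_add(v, 1.0)     # side-effecting op, unrelated to total
    total = tf.constant(2.0) * 3.0   # the value we actually fetch

    with tf.Session() as sess:
        sess.run(tf.initialize_all_variables())
        print(sess.run(total))       # 6.0, computed without touching bump
        print(sess.run(v))           # 0.0, confirming that bump never ran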

Finally, initialization operations, such as the one created by weightsLayer1 = tf.Variable(tf.truncated_normal([nInputUnits, nOutputUnits])), are usually run via sess.run(tf.initialize_all_variables()).
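For reference, the usual pattern looks like this (tf.initialize_all_variables was later renamed tf.global_variables_initializer; both return one op that groups every variable's initializer):

init_op = tf.initialize_all_variables()
with tf.Session() as sess:
    sess.run(init_op)   # this is the moment the truncated_normal values are actually sampled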

Edit: After re-reading your question, I want to be more clear about one aspect. No operations are actually executed by the graph definition code you provided. Tensorflow operations are executed if and only if you start a session and request the execution of parts of your graph. As stated above, that includes the initialization operations.
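To tie this back to the code in your question, a minimal runnable version might look like the sketch below. The loss, the layer sizes and the random input batch are made up purely so the script runs; the point is that only the sess.run calls at the bottom execute anything.

import numpy as np
import tensorflow as tf

nInputUnits, nOutputUnits = 4, 2

with tf.Graph().as_default():
    images_placeholder = tf.placeholder(tf.float32, [None, nInputUnits])

    # These lines only define ops and variables; nothing is computed yet.
    weightsLayer1 = tf.Variable(tf.truncated_normal([nInputUnits, nOutputUnits]))
    weightsLayer1 = tf.div(weightsLayer1, tf.sqrt(tf.to_float(nInputUnits)))
    biasesLayer1 = tf.Variable(tf.zeros([nOutputUnits]))
    layer1output = tf.tanh(tf.matmul(images_placeholder, weightsLayer1) + biasesLayer1)

    loss = tf.reduce_mean(tf.square(layer1output))   # dummy loss, for illustration only
    train_step = tf.train.GradientDescentOptimizer(0.01).minimize(loss)

    with tf.Session() as sess:
        sess.run(tf.initialize_all_variables())      # the initializers run exactly once, here
        for _ in range(100):
            # Each call runs only the ops train_step depends on: the forward pass,
            # the gradients and the variable updates, not the initializers.
            batch = np.random.rand(8, nInputUnits).astype(np.float32)
            sess.run(train_step, feed_dict={images_placeholder: batch})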
