Uninitialized Value Error While Using Adadelta Optimizer In Tensorflow
Solution 1:
The problem here is that `tf.initialize_all_variables()` is a misleading name: it really means "return an operation that initializes all variables that have *already been created* (in the default graph)". When you call `tf.train.AdadeltaOptimizer(...).minimize()`, TensorFlow creates additional (internal slot) variables, which are not covered by the `init` op that you created earlier.
Moving the line:

    init = tf.initialize_all_variables()

...after the construction of the `tf.train.AdadeltaOptimizer` should make your program work.
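To illustrate the ordering, here is a minimal sketch (using a toy loss, not your actual model). It assumes TF1-style graph mode, written against `tf.compat.v1` so it also runs under TensorFlow 2; `tf.global_variables_initializer()` is the current name for `tf.initialize_all_variables()`:

```python
import numpy as np
import tensorflow.compat.v1 as tf  # TF1-style graph mode under TF2

tf.disable_eager_execution()

x = tf.placeholder(tf.float32, shape=[None, 2])
w = tf.Variable(tf.ones([2, 1]))
loss = tf.reduce_mean(tf.square(tf.matmul(x, w)))

# Build the optimizer FIRST: minimize() creates Adadelta's internal
# slot variables (the gradient and update accumulators).
train_op = tf.train.AdadeltaOptimizer(learning_rate=0.1).minimize(loss)

# Only now create the init op, so it covers the slot variables too.
init = tf.global_variables_initializer()

with tf.Session() as sess:
    sess.run(init)  # no "uninitialized value" error
    sess.run(train_op, feed_dict={x: np.ones((4, 2), dtype=np.float32)})
    w_after = sess.run(w)
```

If `init` were created before `minimize()`, the first `sess.run(train_op)` would fail with a `FailedPreconditionError` complaining about an uninitialized Adadelta accumulator.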
N.B. Your program rebuilds the entire network, apart from the variables, on each training step. This is likely to be very inefficient, and the Adadelta algorithm will not adapt as expected because its state is recreated on each step. I would strongly recommend moving the code from the definition of `batch_xs` to the creation of the optimizer outside of the two nested `for` loops. You should define `tf.placeholder()` ops for the `batch_xs` and `batch_ys` inputs, and use the `feed_dict` argument to `sess.run()` to pass in the values returned by `mnist.train.next_batch()`.
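The restructuring above might look like the following sketch: the graph (placeholders, model, optimizer, init op) is built once, and only `sess.run()` happens inside the loop. The MNIST-shaped random batches are a stand-in for `mnist.train.next_batch()`, and the softmax model is a hypothetical placeholder for your network:

```python
import numpy as np
import tensorflow.compat.v1 as tf  # TF1-style graph mode under TF2

tf.disable_eager_execution()

# Build the whole graph ONCE, before the training loop.
xs_ph = tf.placeholder(tf.float32, shape=[None, 784])  # MNIST pixels
ys_ph = tf.placeholder(tf.float32, shape=[None, 10])   # one-hot labels

W = tf.Variable(tf.zeros([784, 10]))
b = tf.Variable(tf.zeros([10]))
logits = tf.matmul(xs_ph, W) + b
loss = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits_v2(labels=ys_ph, logits=logits))

train_op = tf.train.AdadeltaOptimizer(learning_rate=0.1).minimize(loss)
init = tf.global_variables_initializer()  # after minimize(), so slots are covered

rng = np.random.RandomState(0)
with tf.Session() as sess:
    sess.run(init)
    for _ in range(10):
        # In your program: batch_xs, batch_ys = mnist.train.next_batch(100)
        batch_xs = rng.rand(100, 784).astype(np.float32)
        batch_ys = np.eye(10, dtype=np.float32)[rng.randint(10, size=100)]
        _, loss_val = sess.run([train_op, loss],
                               feed_dict={xs_ph: batch_xs, ys_ph: batch_ys})
final_loss = float(loss_val)
```

Because the optimizer is created once, Adadelta's accumulators persist across steps and the algorithm can adapt the effective learning rate as intended.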