Neural network: how / where do weights and biases get their values/updates in TensorFlow?
from tensorflow.examples.tutorials.mnist import input_data
mnist = input_data.read_data_sets("mnist_data/", one_hot=True)

import tensorflow as tf

x = tf.placeholder(tf.float32, [None, 784])
w = tf.Variable(tf.zeros([784, 10]))  # weights!
b = tf.Variable(tf.zeros([10]))       # bias!
y = tf.nn.softmax(tf.matmul(x, w) + b)
y_ = tf.placeholder(tf.float32, [None, 10])

cross_entropy = tf.reduce_mean(-tf.reduce_sum(y_ * tf.log(y), reduction_indices=1))
train = tf.train.GradientDescentOptimizer(0.01).minimize(cross_entropy)

sess = tf.InteractiveSession()
tf.global_variables_initializer().run()

for _ in range(10):
    xs, ys = mnist.train.next_batch(100)
    sess.run(train, {x: xs, y_: ys})
    print(sess.run(w, {x: xs, y_: ys}))

cross_validation = tf.equal(tf.argmax(y, 1), tf.argmax(y_, 1))
acu = tf.reduce_mean(tf.cast(cross_validation, tf.float32))
print(sess.run(acu, {x: mnist.test.images, y_: mnist.test.labels}))
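As far as I can tell, minimize() there is shorthand for two TF 1.x optimizer calls, computing the gradients and then applying them. A sketch of that pair written out explicitly (not part of the tutorial):

optimizer = tf.train.GradientDescentOptimizer(0.01)
grads_and_vars = optimizer.compute_gradients(cross_entropy)  # list of (gradient, variable) pairs for w and b
train = optimizer.apply_gradients(grads_and_vars)            # w <- w - 0.01 * dL/dw, b <- b - 0.01 * dL/db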
This is TensorFlow tutorial code. I understand it almost completely, but I don't understand the intuition behind the weight and bias updates.

How I see it:

matmul calculates x * w, but w is tf.zeros and the bias is tf.zeros as well, so the result of the first forward-propagation epoch is zero: 1*0 + 0 = 0. Does TensorFlow add random weight and bias values somewhere?
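Writing the very first update out by hand shows where the values come from: the gradient of the softmax cross-entropy with respect to the logits is probs - labels, which is nonzero even when the logits themselves are all zero. A minimal NumPy sketch of that first step (the input, label index, and learning rate here are made up for illustration):

import numpy as np

x = np.ones((1, 784), dtype=np.float32)    # one fake input image
w = np.zeros((784, 10), dtype=np.float32)  # zero-initialized weights
b = np.zeros(10, dtype=np.float32)         # zero-initialized bias

logits = x @ w + b  # all zeros, exactly as described above
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)  # softmax of zeros -> 0.1 everywhere

y_true = np.zeros((1, 10), dtype=np.float32)
y_true[0, 3] = 1.0  # a made-up one-hot label

grad_logits = probs - y_true      # nonzero: 0.1 everywhere, -0.9 at the true class
grad_w = x.T @ grad_logits        # (784, 10) gradient for the weights
grad_b = grad_logits.sum(axis=0)  # (10,) gradient for the bias

lr = 0.01
w -= lr * grad_w  # one gradient-descent step, as minimize() does internally
b -= lr * grad_b
print(b)  # no longer zero: no random initialization was needed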