Neural network: how to feed data in batches to a TensorFlow CNN?
Almost all examples on GitHub or other blogs use the MNIST dataset demo. When I try to use the same deep NN with my own image data, I encounter the following problem.
They use:

```python
batch_x, batch_y = mnist.train.next_batch(batch_size)
# Run the optimization op (backprop)
sess.run(train_op, feed_dict={x: trainimg, y: trainlabel, keep_prob: 0.8})
```
The `next_batch` method feeds the data in batches.
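The behaviour of `next_batch` can be sketched with a minimal pure-Python helper (a hypothetical reimplementation for illustration only, not TensorFlow's actual code):

```python
class InMemoryDataset:
    """Minimal stand-in for mnist.train: serves (x, y) pairs in batches."""

    def __init__(self, images, labels):
        assert len(images) == len(labels)
        self._images = images
        self._labels = labels
        self._pos = 0

    def next_batch(self, batch_size):
        # Wrap around to the start once the data is exhausted.
        if self._pos + batch_size > len(self._images):
            self._pos = 0
        start, end = self._pos, self._pos + batch_size
        self._pos = end
        return self._images[start:end], self._labels[start:end]

# Toy data: 10 "images" with alternating labels.
train = InMemoryDataset(list(range(10)), [i % 2 for i in range(10)])
batch_x, batch_y = train.next_batch(4)  # -> ([0, 1, 2, 3], [0, 1, 0, 1])
```

Each call advances an internal cursor, which is all `next_batch` really does; the real MNIST helper additionally shuffles between epochs.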
My question is: is there a similar method to feed my own image data in batches?
You should have a look at `tf.contrib.data.Dataset`. You can create an input pipeline with it: define the source, apply a transformation, and batch it. See the programmer's guide on importing data.
From the documentation:

The Dataset API enables you to build complex input pipelines from simple, reusable pieces. For example, the pipeline for an image model might aggregate data from files in a distributed file system, apply random perturbations to each image, and merge randomly selected images into a batch for training.
Edit:

I guess you have an array of pictures (filenames). Here is an example from the programmer's guide.
Depending on your input files, the transformation part will change. Here is an extract for consuming an array of picture files.
```python
# Reads an image from a file, decodes it into a dense tensor, and resizes it
# to a fixed shape.
def _parse_function(filename, label):
    image_string = tf.read_file(filename)
    image_decoded = tf.image.decode_image(image_string)
    image_resized = tf.image.resize_images(image_decoded, [28, 28])
    return image_resized, label

# A vector of filenames.
filenames = tf.constant(["/var/data/image1.jpg", "/var/data/image2.jpg", ...])

# labels[i] is the label for the image in filenames[i].
labels = tf.constant([0, 37, ...])

dataset = tf.contrib.data.Dataset.from_tensor_slices((filenames, labels))
dataset = dataset.map(_parse_function)
# We now have a dataset of (image, label) pairs: a kind of list of
# pictures encoded along with their labels.

# Batch it.
dataset = dataset.batch(32)

# Create an iterator.
iterator = dataset.make_one_shot_iterator()

# Retrieve the next element.
image_batch, label_batch = iterator.get_next()
```
You could also shuffle the images.
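`Dataset.shuffle(buffer_size)` draws samples at random from a sliding buffer rather than shuffling the whole dataset at once. Its semantics can be sketched in plain Python (an illustrative model, not TensorFlow's implementation):

```python
import random

def buffered_shuffle(items, buffer_size, seed=None):
    """Yield items in the pseudo-random order produced by a sliding buffer,
    as Dataset.shuffle(buffer_size) does: fill a buffer from the input,
    emit a random element, refill, repeat."""
    rng = random.Random(seed)
    buffer = []
    for item in items:
        buffer.append(item)
        if len(buffer) >= buffer_size:
            yield buffer.pop(rng.randrange(len(buffer)))
    # Input exhausted: drain what remains in the buffer.
    while buffer:
        yield buffer.pop(rng.randrange(len(buffer)))
```

With `buffer_size` equal to the dataset size this is a full shuffle; smaller buffers trade randomness for memory, which matters when the images do not fit in RAM.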
Now you can use `image_batch` and `label_batch` in place of placeholders in your model definition.
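The one-shot iterator pattern, pulling batches until the data runs out, can be modelled in plain Python (illustrative only; in TensorFlow 1.x the equivalent is calling `sess.run` in a loop until `tf.errors.OutOfRangeError` is raised):

```python
def one_shot_batches(data, batch_size):
    """Yield successive batches exactly once, like make_one_shot_iterator():
    no reshuffle, no restart; exhaustion ends the epoch."""
    for start in range(0, len(data), batch_size):
        yield data[start:start + batch_size]

for batch in one_shot_batches(list(range(7)), batch_size=3):
    pass  # train on `batch`; the loop ends when the iterator is exhausted
```

Note that the last batch may be smaller than `batch_size`, which also holds for `Dataset.batch(32)` unless the dataset size divides evenly.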