【Posted】: 2018-05-06 16:24:56
【Problem description】:
I am new to tf.train.batch, so I wrote a small example to test it. When I run the code, I get no output and the process just keeps running.
Have you run into the same situation before? Thanks a lot!
import os
os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'
import numpy as np
import tensorflow as tf

a = [[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]
b = [1, 2, 3, 4]
input_queue = tf.train.slice_input_producer([a, b], num_epochs=None, shuffle=False)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    coord = tf.train.Coordinator()
    threads = tf.train.start_queue_runners(sess=sess, coord=coord)
    for i in range(4):
        x, y = tf.train.batch([a, b], batch_size=2)
        x_, y_ = sess.run([x, y])
        print(x_, y_)
    coord.request_stop()
    coord.join(threads)
By the way, tf.train.slice_input_producer does work on its own. When I leave out tf.train.batch, the code becomes:
import os
os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'
import numpy as np
import tensorflow as tf

a = [[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]
b = [1, 2, 3, 4]
input_queue = tf.train.slice_input_producer([a, b], num_epochs=None, shuffle=False)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    coord = tf.train.Coordinator()
    threads = tf.train.start_queue_runners(sess=sess, coord=coord)
    for i in range(4):
        print(sess.run(input_queue))
    coord.request_stop()
    coord.join(threads)
The result is:
[array([1, 2, 3, 4]), 1]
[array([1, 2, 3, 4]), 2]
[array([1, 2, 3, 4]), 3]
[array([1, 2, 3, 4]), 4]
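A likely cause of the hang: in the first listing, tf.train.batch is called inside the loop, so each iteration creates a brand-new queue (with its own queue runner) after tf.train.start_queue_runners has already been called; that new queue is never filled, and sess.run blocks forever. It is also passed the raw lists a and b rather than input_queue. Conceptually, slice_input_producer emits one row of each tensor at a time and batch stacks consecutive slices along a new leading axis. That pipeline can be sketched in plain NumPy (the helper names here are illustrative, not TensorFlow APIs):

```python
import numpy as np

def slice_rows(tensors):
    """Yield one 'slice' at a time: the i-th element of each input,
    mimicking what tf.train.slice_input_producer enqueues."""
    n = len(tensors[0])
    for i in range(n):
        yield [np.asarray(t[i]) for t in tensors]

def batch_slices(slices, batch_size):
    """Group consecutive slices into batches, stacking each component
    along a new leading axis, mimicking tf.train.batch."""
    buf = []
    for s in slices:
        buf.append(s)
        if len(buf) == batch_size:
            # zip(*buf) regroups per-slice lists into per-tensor columns
            yield [np.stack(cols) for cols in zip(*buf)]
            buf = []

a = [[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]
b = [1, 2, 3, 4]

for x, y in batch_slices(slice_rows([a, b]), batch_size=2):
    print(x, y)  # x has shape (2, 4); y is [1, 2] then [3, 4]
```

In the real TF 1.x pattern, the batching op would be built once from input_queue, before the session is started, so its queue runner gets launched along with the others.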
【Discussion】:
-
What result are you expecting?
tf.train.batch is mainly intended for input pipelines.
Tags: python tensorflow tensor