【Posted】: 2018-07-11 05:51:53
【Problem Description】:
So I wrote code for a CNN; the image tensor has shape (350, 1, 1000, 1) in NHWC format. The data itself is fine and the network runs, but the output is strange. Here is my code (there are probably no syntax errors, but there are some logic errors):
weights_conv1 = tf.get_variable(name = 'wc1', dtype = tf.float32, initializer = tf.random_normal(shape = (1, 3, 1, 20), mean = 0, stddev = 1.0))
weights_conv2 = tf.get_variable(name = 'wc2', dtype = tf.float32, initializer = tf.random_normal(shape = (1, 3, 20, 20), mean = 0, stddev = 1.0))
weights_conv3 = tf.get_variable(name = 'wc3', dtype = tf.float32, initializer = tf.random_normal(shape = (1, 3, 20, 20), mean = 0, stddev = 1.0))
weights_conv4 = tf.get_variable(name = 'wc4', dtype = tf.float32, initializer = tf.random_normal(shape = (1, 3, 20, 20), mean = 0, stddev = 1.0))
weights_conv5 = tf.get_variable(name = 'wc5', dtype = tf.float32, initializer = tf.random_normal(shape = (1, 3, 20, 10), mean = 0, stddev = 1.0))
filters = [weights_conv1] + [weights_conv2] + [weights_conv3] + [weights_conv4] + [weights_conv5]
bias1 = tf.get_variable(name = 'b1', dtype = tf.float32, initializer = tf.random_normal(mean = 0, stddev = 1.0, shape = (1, 1, 1, 20)))
bias2 = tf.get_variable(name = 'b2', dtype = tf.float32, initializer = tf.random_normal(mean = 0, stddev = 1.0, shape = (1, 1, 1, 20)))
bias3 = tf.get_variable(name = 'b3', dtype = tf.float32, initializer = tf.random_normal(mean = 0, stddev = 1.0, shape = (1, 1, 1, 20)))
bias4 = tf.get_variable(name = 'b4', dtype = tf.float32, initializer = tf.random_normal(mean = 0, stddev = 1.0, shape = (1, 1, 1, 20)))
bias5 = tf.get_variable(name = 'b5', dtype = tf.float32, initializer = tf.random_normal(mean = 0, stddev = 1.0, shape = (1, 1, 1, 10)))
biases = [bias1] + [bias2] + [bias3] + [bias4] + [bias5]
def convolutionForwardPropagation(img, filters, biases):
    c1 = tf.nn.conv2d(img, filters[0], strides =[1,1,1,1], data_format ='NHWC', padding = 'VALID')
    f1 = tf.nn.relu(c1) + biases[0]
    c2 = tf.nn.conv2d(f1, filters[1], strides =[1,1,1,1], data_format ='NHWC', padding = 'VALID')
    f2 = tf.nn.relu(c2) + biases[1]
    c3 = tf.nn.conv2d(f2, filters[2], strides =[1,1,1,1], data_format ='NHWC', padding = 'VALID')
    f3 = tf.nn.relu(c3) + biases[2]
    c4 = tf.nn.conv2d(f3, filters[3], strides =[1,1,1,1], data_format ='NHWC', padding = 'VALID')
    f4 = tf.nn.relu(c4) + biases[3]
    c5 = tf.nn.conv2d(f4, filters[4], strides =[1,1,1,1], data_format ='NHWC', padding = 'VALID')
    f5 = tf.nn.relu(c5) + biases[4]
    shape = f5.shape
    fr = tf.reshape(f5, (shape[0], shape[3] * shape[2]))
    fc1 = tf.contrib.layers.fully_connected(fr, 1000, activation_fn = tf.nn.relu)
    fc2 = tf.contrib.layers.fully_connected(fc1, 2, activation_fn = tf.nn.relu)
    print(fc2.shape)
    return fc2
fc = convolutionForwardPropagation(trainDataset, filters, biases)
entropy = tf.nn.softmax_cross_entropy_with_logits_v2(logits = fc, labels = labelsTrain, name = 'cross_entropy')
loss = tf.reduce_mean(entropy, name = 'loss')
optimizer = tf.train.AdamOptimizer(learning_rate).minimize(loss)
hypothesis = tf.nn.softmax(fc)
correct_preds = tf.equal(tf.argmax(hypothesis, 1), tf.argmax(labelsTrain, 1))
accuracy = tf.reduce_sum(tf.cast(correct_preds, tf.float32))
writer = tf.summary.FileWriter('./graphs/logreg', tf.get_default_graph())
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for i in range(0, epochs):
        sess.run(fc)
        _, l = sess.run([optimizer, loss])
        sess.run(hypothesis)
        sess.run(correct_preds)
        acc = sess.run(accuracy)
        print("Epoch :", i+1, ", loss : ", l, ", accuracy :", acc)
writer.close()
The results are a bit strange:
(350, 2)
Epoch : 1 , loss : 599.7743 , accuracy : 175.0
Epoch : 2 , loss : 64824.633 , accuracy : 175.0
Epoch : 3 , loss : 15540.435 , accuracy : 175.0
Epoch : 4 , loss : 0.69314754 , accuracy : 175.0
Epoch : 5 , loss : 0.69314754 , accuracy : 175.0
Epoch : 6 , loss : 0.69314754 , accuracy : 175.0
Epoch : 7 , loss : 0.69314754 , accuracy : 175.0
Epoch : 8 , loss : 0.69314754 , accuracy : 175.0
.....
So what exactly is wrong with my code?
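For reference, the plateau values themselves are telling (a side note, assuming the 350 samples are roughly balanced between the two classes): 0.69314754 is ln(2), the cross-entropy a two-class softmax incurs when it outputs (0.5, 0.5) for every sample, and 175 is exactly half of 350, since accuracy is computed with tf.reduce_sum and is therefore a raw count of correct predictions rather than a fraction. A quick check:

import math

# ln(2): the loss of a two-class softmax that always predicts (0.5, 0.5),
# i.e. the network has collapsed to a constant output.
print(math.log(2))   # 0.6931471805599453, matching the plateaued loss

# the reported "accuracy" is a count of correct predictions, not a fraction;
# 175 out of 350 samples is 50%, chance level if the two classes are balanced.
print(175 / 350)     # 0.5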
【Comments】:
-
Out of curiosity: what value did you set the learning rate to?
-
Could you also explain what exactly "correct_preds = tf.equal(tf.argmax(hypothesis, 1), tf.argmax(labelsTrain, 1))" is supposed to do, i.e. what hypothesis implements? I assume that's where your error lies...
-
@dennlinger It checks whether the predictions are correct. Also, the lr is 0.001.
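To make the comment above concrete: correct_preds compares the argmax of the softmax output against the argmax of the one-hot labels, and accuracy sums the matches. A minimal sketch of the common variant that reports a fraction instead of a count (this is an assumption about the intent, not code from the post):

# compare the predicted class against the one-hot label for each sample
correct_preds = tf.equal(tf.argmax(hypothesis, 1), tf.argmax(labelsTrain, 1))
# reduce_mean instead of reduce_sum yields a value in [0, 1] (fraction correct)
accuracy = tf.reduce_mean(tf.cast(correct_preds, tf.float32))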
Tags: python python-3.x tensorflow