[Posted] 2018-05-18 15:18:04
[Question]:
I have defined a logistic regression model as a single-layer network with a sigmoid activation, built with tf.layers. The prediction is a 2-D tensor of shape (n, 1), where n is the batch size. However, to compute the loss function I need a 1-D tensor of shape (n,). I can of course reshape the tensor (which is what I do at the moment), but it feels like there should be something more elegant. Is there?
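For context, the reshape is not merely cosmetic: elementwise operations broadcast a (n,) array against a (n, 1) array into an (n, n) matrix, and the mean is then taken over n*n numbers instead of n. A minimal NumPy sketch of the trap, with made-up values:

```python
import numpy as np

y_true = np.array([0., 1., 1., 0.])               # labels, shape (4,)
y_pred = np.array([[0.1], [0.9], [0.8], [0.2]])   # predictions, shape (4, 1)

# Broadcasting (4,) against (4, 1) silently produces a (4, 4) matrix.
wrong = y_true * np.log(y_pred)
assert wrong.shape == (4, 4)

# Flattening the predictions restores elementwise alignment.
right = y_true * np.log(y_pred.ravel())
assert right.shape == (4,)
```

TensorFlow follows the same broadcasting rules, which is why the unreshaped loss in the code below comes out wrong without raising any error.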
Code to reproduce the problem:
import tensorflow as tf
import numpy as np
data = np.random.random((20, 6))
data[:, -1] = data[:, -1] > 0.5
e = tf.data.Dataset.from_tensor_slices(data).batch(2).make_one_shot_iterator().get_next()
x, y_ = e[:, :-1], e[:, -1]
y = tf.layers.dense(x, 1, activation=tf.nn.sigmoid)
loss_wrong = - tf.reduce_mean(tf.add(tf.multiply(y_, tf.log(y)), tf.multiply((1. - y_), tf.log(1. - y))))
y2 = tf.reshape(y, [-1]) # ugly reshape I would like to get rid of
loss_correct = - tf.reduce_mean(tf.add(tf.multiply(y_, tf.log(y2)), tf.multiply((1. - y_), tf.log(1. - y2))))
with tf.Session() as sess:
    tf.global_variables_initializer().run()
    a, b, b2, lw, lc = sess.run([y_, y, y2, loss_wrong, loss_correct])
    print(a)   # a 1-D array with the true labels
    print(b)   # a dangerous array of arrays!
    print(b2)  # notice how nice and flat this is
    print(lw)  # wrong whenever the labels are not all the same
    print(lc)  # correct (can be checked by hand against the printed data)
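Two reshape-free alternatives seem possible (a sketch, not a definitive answer): squeeze the singleton axis from the predictions with tf.squeeze(y, axis=1), or lift the labels to shape (n, 1), e.g. with tf.expand_dims(y_, 1), so the two operands already align. That the two routes give the same loss can be checked in plain NumPy:

```python
import numpy as np

y_true = np.array([0., 1., 1.])           # labels, shape (3,)
y_pred = np.array([[0.1], [0.9], [0.8]])  # dense-layer output, shape (3, 1)

# Option 1: drop the singleton axis from the predictions (tf.squeeze analogue).
yp = np.squeeze(y_pred, axis=1)
loss1 = -np.mean(y_true * np.log(yp) + (1. - y_true) * np.log(1. - yp))

# Option 2: lift the labels to (3, 1) so both operands align (tf.expand_dims analogue).
y2 = y_true[:, None]
loss2 = -np.mean(y2 * np.log(y_pred) + (1. - y2) * np.log(1. - y_pred))

assert np.isclose(loss1, loss2)
```

Option 2 is arguably the cleanest, since the dense layer's (n, 1) output is never touched and the labels are fixed once where the dataset is defined.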
[Question discussion]:
Tags: python tensorflow