【Question Title】: Binary output with tflearn
【Posted】: 2017-12-21 07:30:02
【Question】:

I'm a beginner with tflearn/TensorFlow. I'm building a DNN to classify heartbeats as normal (0) or arrhythmia (1). My dataset is ECG data from the MIT-BIH Arrhythmia Database.

I built the following network:

## Build neural network
net = tflearn.input_data(shape=[None, 200])
net = tflearn.fully_connected(net, 400)
net = tflearn.fully_connected(net, 400)
net = tflearn.fully_connected(net, 100)
net = tflearn.fully_connected(net, 50)
# Single linear output neuron, trained with mean squared error
net = tflearn.fully_connected(net, 1, activation='linear')
net = tflearn.regression(net, optimizer='adam', learning_rate=0.01,
                         loss='mean_square', name='target')
##
## Define model
model = tflearn.DNN(net)
print('fitting')
model.fit(TrainSet, labelSet, n_epoch=5000, batch_size=len(TrainSet), show_metric=True)

My output looks like this:

y pred                   y target
predict:[0.01360663]      [0]
predict:[0.00861748]      [0]
predict:[-0.00685573]     [0]
predict:[-0.20846206]     [1]

My network separates the classes well: the output for a normal sample (predict:[0.00861748] → [0]) is quite different from the output for an arrhythmia sample (predict:[-0.20846206] → [1]).

So how can I make the output be just 0 or 1 instead of these float values? Should I change my activation function, or post-process the raw outputs somehow?

Here is my output from model.predict(TrainingSet):

y predicted             y target
predict:[-0.138634]     [0]
predict:[-0.13436639]     [0]
predict:[-0.12879151]     [0]
predict:[-0.12057236]     [0]
predict:[-0.13836551]     [0]
predict:[-0.08525576]     [0]
predict:[ 1.01741135]     [1]
predict:[-0.11624834]     [0]
predict:[-0.12631142]     [0]
predict:[-0.11693959]     [0]
predict:[-0.10779606]     [0]
predict:[-0.11510199]     [0]
predict:[-0.12450527]     [0]
predict:[-0.12869376]     [0]
predict:[-0.15167347]     [0]
predict:[-0.14081171]     [0]
predict:[-0.14235598]     [0]
predict:[-0.13095573]     [0]
predict:[-0.12757528]     [0]
predict:[-0.14675851]     [0]
predict:[-0.12311366]     [0]
predict:[-0.15386838]     [0]
predict:[-0.17505151]     [0]
predict:[-0.13848163]     [0]
predict:[-0.11671469]     [0]
predict:[-0.13247125]     [0]
predict:[-0.13718334]     [0]
predict:[-0.12702732]     [0]
predict:[-0.12665084]     [0]
predict:[-0.1367469]     [0]
predict:[-0.15925398]     [0]
predict:[-0.13639028]     [0]
predict:[-0.11569472]     [0]
predict:[-0.14167]     [0]
predict:[-0.12262306]     [0]
predict:[-0.10863069]     [0]
predict:[-0.14324963]     [0]
predict:[-0.14792402]     [0]
predict:[-0.14929616]     [0]
predict:[-0.15551159]     [0]
predict:[-0.11816701]     [0]
predict:[-0.11785387]     [0]
predict:[-0.15215725]     [0]
predict:[-0.11279716]     [0]
predict:[-0.1469961]     [0]
predict:[-0.14991215]     [0]
predict:[-0.11661309]     [0]
predict:[-0.09011015]     [0]
predict:[-0.09775476]     [0]
predict:[-0.1065342]     [0]
predict:[-0.11091903]     [0]
predict:[-0.10344772]     [0]
predict:[-0.12412915]     [0]
predict:[-0.13605709]     [0]
predict:[-0.12797417]     [0]
predict:[-0.1076207]     [0]
predict:[-0.12150024]     [0]
predict:[-0.13840012]     [0]
predict:[-0.13084875]     [0]
predict:[-0.11066008]     [0]
predict:[-0.12374203]     [0]
predict:[-0.13341869]     [0]
predict:[-0.12912038]     [0]
predict:[-0.13748281]     [0]
predict:[-0.13966258]     [0]
predict:[-0.13894111]     [0]
predict:[-0.10213074]     [0]
predict:[-0.15602994]     [0]
predict:[-0.12982219]     [0]
predict:[-0.09376201]     [0]
predict:[-0.08830833]     [0]
predict:[-0.12029025]     [0]
predict:[-0.09362413]     [0]
predict:[ 1.09521723]     [1]
predict:[-0.13147078]     [0]
predict:[-0.1182971]     [0]
predict:[-0.12983324]     [0]
predict:[-0.18321729]     [0]
predict:[-0.18334746]     [0]
predict:[-0.2399022]     [0]

【Question Comments】:

  • I'm not familiar with the dataset, but the separation doesn't look that good to me... I would have expected the third output to be [1], since it is negative. That said, it's impossible to spot any pattern from only the 4 examples you provided.
  • You could change the activation to sigmoid.
  • You seem to be misusing the network's activation functions. It is odd that you use MSE as the loss function while the last layer has a linear activation. Unfortunately, explaining why is off-topic for SO. Consider studying neural networks further (including typical activation and loss functions), and possibly rework your question and move it to Cross Validated.
  • Thanks for the replies. I changed my network to: `net = tflearn.fully_connected(net, 1, activation='sigmoid')` and `net = tflearn.regression(net, optimizer='adam', loss='binary_crossentropy')`
  • That is also not appropriate: binary cross-entropy expects a logit as the network's output, but you are feeding it a neuron with a sigmoid activation. You may find that 'binary_crossentropy' works better with 'linear' on the last neuron. See also these questions on other Stack Exchange sites: stats.stackexchange.com/q/260505/67965 and datascience.stackexchange.com/questions/9302/…
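The post-processing step implied by the comments above (a linear final layer producing a logit, converted to a hard 0/1 label at prediction time) can be sketched with plain NumPy. The helper names below are illustrative, not part of tflearn:

```python
import numpy as np

def sigmoid(z):
    # Logistic function: maps a raw logit to a probability in (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def logits_to_labels(logits, threshold=0.5):
    # Squash linear-layer outputs (logits) to probabilities,
    # then threshold to obtain hard 0/1 class labels
    probs = sigmoid(np.asarray(logits, dtype=float))
    return (probs >= threshold).astype(int)

# Sample logits roughly like the ones in the question
print(logits_to_labels([-0.208, 0.013, 1.017]))  # → [0 1 1]
```

This is independent of how the model was trained; it only assumes the final layer emits one real-valued score per sample.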

Tags: python tensorflow neural-network deep-learning tflearn


【Solution 1】:

One way is to change the activation function to 'softmax' and round the predictions. You can do it like this:

net = tflearn.fully_connected(net, 1, activation='sigmoid')

and then predict:

import numpy as np

pred = model.predict(test_data)
# Each prediction is a single probability; rounding gives the 0/1 label
print([int(r[0]) for r in np.round(pred)])

That should work.
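With a single sigmoid output neuron, each prediction is one probability, so rounding alone yields the 0/1 label. A minimal NumPy sketch (the probability values below are hypothetical, standing in for what `model.predict` would return):

```python
import numpy as np

# Hypothetical single-neuron sigmoid outputs: one probability per sample
pred = [[0.02], [0.91], [0.47], [0.63]]

# Round each probability to the nearest integer to get the class label
labels = [int(r[0]) for r in np.round(pred)]
print(labels)  # → [0, 1, 0, 1]
```

Note that `np.round` uses a fixed 0.5 cut-off; if the classes are imbalanced (as arrhythmia data often is), comparing against an explicit threshold may work better.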

【Comments】:

  • A softmax activation on a single neuron makes no sense. You probably meant tflearn.fully_connected(net, 2, activation='softmax'), which only works for 2 mutually exclusive classes.