【Posted】: 2019-08-29 20:15:55
【Question】:
I am trying to replicate in PyTorch some code written in TensorFlow. The TensorFlow code uses the loss function softmax_cross_entropy_with_logits. Looking for its PyTorch equivalent, I found torch.nn.MultiLabelSoftMarginLoss, though I am not sure it is the correct function. I also do not know how to measure accuracy when I use this loss function and my network has no relu layer at the end. This is my code:
# GRADED FUNCTION: compute_cost
def compute_cost(Z3, Y):
    loss = torch.nn.MultiLabelSoftMarginLoss()
    return loss(Z3, Y)

def model(net, X_train, y_train, X_test, y_test, learning_rate=0.009,
          num_epochs=100, minibatch_size=64, print_cost=True):
    optimizer = torch.optim.Adam(net.parameters(), lr=learning_rate)
    total_train_acc = 0
    for epoch in range(num_epochs):
        running_loss = 0.0  # reset once per epoch, not per minibatch
        for i, data in enumerate(train_loader, 0):
            inputs, labels = data
            optimizer.zero_grad()  # gradients must be cleared every minibatch
            Z3 = net(inputs)
            # Cost function
            cost = compute_cost(Z3, labels)
            # Backpropagation: use an Adam optimizer that minimizes the cost.
            cost.backward()
            optimizer.step()
            running_loss += cost.item()
            # Measuring the accuracy of the minibatch: threshold the logits
            # at 0 (i.e. sigmoid > 0.5) and count element-wise matches,
            # rather than comparing raw logits to labels directly.
            preds = (Z3 > 0).float()
            total_train_acc += (preds == labels).float().mean().item()
        # Print the average cost of each epoch
        if epoch % 1 == 0:
            print("Cost after epoch {}: {:.3f}".format(
                epoch, running_loss / len(train_loader)))
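For reference, a minimal sketch (with made-up tensors, not from the model above) of how the two losses differ: TensorFlow's softmax_cross_entropy_with_logits is cross-entropy against a softmax over all classes, which for one-hot labels matches torch's cross_entropy on class indices, while MultiLabelSoftMarginLoss applies an independent sigmoid per class (equivalent to binary_cross_entropy_with_logits):

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
logits = torch.randn(4, 3)                          # hypothetical batch of 4, 3 classes
targets = torch.eye(3)[torch.tensor([0, 2, 1, 1])]  # one-hot labels

# softmax_cross_entropy_with_logits in PyTorch terms:
# cross-entropy between log-softmax of the logits and the (soft) targets.
softmax_xent = -(targets * F.log_softmax(logits, dim=1)).sum(dim=1).mean()

# For hard one-hot labels this equals cross_entropy on class indices:
ce = F.cross_entropy(logits, targets.argmax(dim=1))

# MultiLabelSoftMarginLoss instead treats every class as an independent
# sigmoid/binary decision, which is a different model of the problem:
mlsm = torch.nn.MultiLabelSoftMarginLoss()(logits, targets)
bce = F.binary_cross_entropy_with_logits(logits, targets)

print(torch.isclose(softmax_xent, ce))  # → tensor(True)
print(torch.isclose(mlsm, bce))         # → tensor(True)
```

So if the TensorFlow code was doing single-label softmax classification, torch.nn.CrossEntropyLoss (with integer class indices) is the closer equivalent; MultiLabelSoftMarginLoss fits only a genuinely multi-label setup.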
【Discussion】:
Tags: tensorflow deep-learning pytorch
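On the accuracy question: a minimal sketch with hypothetical tensors (not tied to the model above) of one common way to measure accuracy without any activation at the end of the network; since sigmoid is monotonic, sigmoid(z) > 0.5 is the same as z > 0, so no final layer is needed:

```python
import torch

torch.manual_seed(0)
logits = torch.randn(4, 3)                  # hypothetical raw network outputs
labels = (torch.rand(4, 3) > 0.5).float()   # hypothetical multi-label targets

# Multi-label accuracy: threshold each logit at 0 (sigmoid > 0.5)
# and average element-wise matches over the whole batch.
preds = (logits > 0).float()
accuracy = (preds == labels).float().mean().item()

# If the labels are actually one-hot single-class (softmax setting),
# take the argmax over classes instead:
single_label_acc = (logits.argmax(dim=1) == labels.argmax(dim=1)).float().mean().item()
```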