[Title]: Simple gradient descent optimizer in PyTorch not working
[Posted]: 2021-12-18 02:00:23
[Question]:

I am trying to implement a simple minimizer in PyTorch with the code below (v, q, and v_trans are tensors; eta is 0.01):

for i in range(10):
    print('i =', i, ' q =', q)
    v_trans = forward(v, q)
    loss = error(v_trans, v_target)
    q.requires_grad = True
    loss.backward()
    grads = q.grad
    with torch.no_grad():
        q = q - eta * grads

print('Final q = ', q)

On the second iteration of the loop, the loss.backward() line raises this error:

Traceback (most recent call last):
  File "C:\Scripts\main.py", line 97, in <module>
    loss.backward()
  File "C:\Users\user\AppData\Local\Programs\Python\Python39\lib\site-packages\torch\_tensor.py", line 307, in backward
    torch.autograd.backward(self, gradient, retain_graph, create_graph, inputs=inputs)
  File "C:\Users\user\AppData\Local\Programs\Python\Python39\lib\site-packages\torch\autograd\__init__.py", line 154, in backward
    Variable._execution_engine.run_backward(
RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn

I have tried several things but cannot get this simple example to work. Is there a tutorial/guide/documentation on how to write a simple optimizer for a project that does not involve a neural network? Alternatively, how can PyTorch's built-in optimizers be used for non-NN projects?

[Comments]:

  • Do v and q need to have their gradients computed?
  • @Ivan q is the tensor being optimized, so it does require gradients; v does not.
  • Can you print v_trans.grad_fn and loss.grad_fn?
  • You need to set q.requires_grad_(True) once, at the beginning.
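Putting the comments together: setting requires_grad once before the loop, and updating q in place so it stays the same leaf tensor, avoids the error. A minimal sketch of the corrected loop, where forward and error are hypothetical stand-ins for the question's unshown functions (here an element-wise scaling and a sum-of-squares loss):

```python
import torch

# Hypothetical stand-ins for the question's forward() and error()
def forward(v, q):
    return v * q

def error(v_trans, v_target):
    return ((v_trans - v_target) ** 2).sum()

v = torch.tensor([1.0, 2.0, 3.0])
v_target = torch.tensor([2.0, 4.0, 6.0])  # optimum at q = 2
eta = 0.01

# set requires_grad ONCE, before the loop, so every iteration's
# graph is built from this same leaf tensor
q = torch.tensor(1.5, requires_grad=True)

for i in range(10):
    v_trans = forward(v, q)
    loss = error(v_trans, v_target)
    loss.backward()
    with torch.no_grad():
        q -= eta * q.grad  # in-place update keeps q a leaf tensor
        q.grad.zero_()     # clear the gradient, which otherwise accumulates

print('Final q =', q.item())
```

The question's version failed because `q = q - eta * grads` rebinds q to a brand-new tensor created inside torch.no_grad(), which does not require grad, so the next backward() has nothing to differentiate.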

[Tags]: python optimization pytorch gradient gradient-descent


[Solution 1]:

Here is a simple example of finding a zero (or a local minimum) of a function, in this case Y = X*X + 3*X + 1.0.

import torch

# the loss function
def mse(Y, target):
    diff = target - Y
    return (diff * diff).sum() / 2

# this is the variable we optimize
X = torch.rand(1, requires_grad=True)
# this is our learning rate
lr = 1e-3

# this is the learning loop
for i in range(1000):
    # here the actual function
    Y = X * X + 3 * X + 1.0
    # we are looking for 0
    loss = mse(Y, 0)
    loss.backward()
    if i % 100 == 0:
        print("X", X.item(), "loss", loss.item(), "grad", X.grad.item())
    with torch.no_grad():
        X -= X.grad * lr
        X.grad.zero_()

print("result found (could be local minimum): ", X.item(), "/", Y.item())

It should give you output similar to this:

X 0.45342570543289185 loss 3.2918500900268555 grad 10.024480819702148
X -0.048841409385204315 loss 0.3662492334842682 grad 2.483980894088745
X -0.21245644986629486 loss 0.08313754200935364 grad 1.0500390529632568
X -0.2880801260471344 loss 0.02392572909593582 grad 0.5302143096923828
X -0.3278542160987854 loss 0.007678795140236616 grad 0.2905181050300598
X -0.35011476278305054 loss 0.0026090284809470177 grad 0.16612648963928223
X -0.3629951477050781 loss 0.0009150659898295999 grad 0.09728223085403442
X -0.37058910727500916 loss 0.00032688589999452233 grad 0.0577557273209095
X -0.3751157820224762 loss 0.0001180334365926683 grad 0.0345664918422699
X -0.377831369638443 loss 4.289587013772689e-05 grad 0.020787909626960754
result found (could be local minimum):  -0.37946680188179016 / 0.00562286376953125
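As for the second part of the question: PyTorch's built-in optimizers work on any list of leaf tensors with requires_grad=True, not just NN parameters. A sketch of the same minimization using torch.optim.SGD instead of the manual update:

```python
import torch

X = torch.rand(1, requires_grad=True)
opt = torch.optim.SGD([X], lr=1e-3)  # any list of leaf tensors works

for i in range(1000):
    opt.zero_grad()              # reset accumulated gradients
    Y = X * X + 3 * X + 1.0
    loss = (Y * Y).sum() / 2     # same "distance to zero" loss as above
    loss.backward()
    opt.step()                   # X <- X - lr * X.grad

print("result:", X.item())
```

opt.step() performs exactly the `X -= X.grad * lr` update from the manual loop, and opt.zero_grad() replaces X.grad.zero_(); swapping SGD for Adam or another optimizer requires no other changes.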

[Comments]:
