When training a GAN, you often run into this error:

RuntimeError: Trying to backward through the graph a second time, but the saved intermediate results have already been freed. Specify retain_graph=True when calling .backward() or autograd.grad() the first time.

In plain terms: on the second backward pass through the graph, the saved intermediate results have already been freed.
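A minimal standalone repro of the message, independent of any GAN code (toy tensors for illustration):

```python
import torch

x = torch.randn(3, requires_grad=True)
y = (x * 2).sum()

y.backward()        # first backward: the graph's intermediates are freed afterwards
try:
    y.backward()    # second backward through the same graph: raises RuntimeError
except RuntimeError as e:
    print("RuntimeError:", e)
```

Passing `retain_graph=True` to the first `backward()` would silence the error, but in a GAN that is usually the wrong fix, as explained below.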

The cause is clear: in a GAN there is one tensor that lives between the generator and the discriminator — the fake sample. The discriminator's backward pass frees the generator's part of the graph, so the generator's backward pass then fails.

Calling detach() on the fake sample before feeding it to the discriminator fixes it.
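A minimal sketch of a GAN training step showing where the `detach()` goes; the tiny linear "generator" and "discriminator" here are stand-ins for your real models:

```python
import torch
import torch.nn as nn

# Toy stand-ins for a real generator / discriminator
gen = nn.Linear(8, 4)
disc = nn.Linear(4, 1)
opt_g = torch.optim.Adam(gen.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(disc.parameters(), lr=1e-3)
criterion = nn.BCEWithLogitsLoss()

real = torch.randn(16, 4)
noise = torch.randn(16, 8)
fake = gen(noise)   # this tensor is shared by both training steps

# --- discriminator step ---
# fake.detach() cuts the tensor off from the generator's graph, so this
# backward() does not traverse (and free) the generator's intermediates.
loss_d = criterion(disc(real), torch.ones(16, 1)) \
       + criterion(disc(fake.detach()), torch.zeros(16, 1))
opt_d.zero_grad()
loss_d.backward()
opt_d.step()

# --- generator step ---
# fake is used here WITHOUT detach(): this backward() walks the
# generator's graph for the first and only time, so no error is raised.
loss_g = criterion(disc(fake), torch.ones(16, 1))
opt_g.zero_grad()
loss_g.backward()
opt_g.step()
```

Without the `detach()`, `loss_d.backward()` would already consume the generator's graph, and `loss_g.backward()` would then raise exactly the RuntimeError above.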
