While running a Scrapy crawler today, it failed with the following error:

Traceback (most recent call last):
  File "d:\python\lib\site-packages\scrapy\core\downloader\middleware.py", line 43, in process_request
    defer.returnValue((yield download_func(request=request,spider=spider)))
twisted.web._newclient.ResponseNeverReceived: [<twisted.python.failure.Failure twisted.internet.error.ConnectionDone: Connection was closed cleanly.>]

Scrapy crawler error notes: twisted.python.failure.Failure twisted.internet.error.ConnectionDone: Connection was closed cleanly

The cause was a missing request header: the target site closed the connection because no User-Agent was sent. I set a USER_AGENT header in middlewares.py, re-ran the crawler, and the problem was resolved.
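A minimal sketch of the fix described above, assuming a custom downloader middleware (the class name and User-Agent string here are illustrative, not from the original project):

```python
# Sketch of a downloader middleware that attaches a User-Agent header.
# The class name and UA string are examples; any realistic browser UA works.
class CustomUserAgentMiddleware:
    USER_AGENT = (
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
        "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0 Safari/537.36"
    )

    def process_request(self, request, spider):
        # Set the header only if the request does not already have one,
        # so per-request overrides are preserved.
        request.headers.setdefault(b"User-Agent", self.USER_AGENT)
        return None  # returning None lets Scrapy continue processing
```

To activate it, the middleware would need to be registered in settings.py under `DOWNLOADER_MIDDLEWARES`; alternatively, for a single static UA, setting the `USER_AGENT` option directly in settings.py achieves the same effect without a custom middleware.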
