【Title】: scrapyd error when scheduling a new spider
【Posted】: 2014-12-18 13:45:46
【Description】:

I can't get a spider run scheduled.

The deployment appears to succeed:

Deploying to project "scraper" in http://localhost:6800/addversion.json
Server response (200):
{"status": "ok", "project": "scraper", "version": "1418909664", "spiders": 3}

I then schedule a new spider run:

curl http://localhost:6800/schedule.json -d project=scraper -d spider=spider


{"status": "ok", "jobid": "3f81a0e486bb11e49a6800163ed5ae93"}

But in the scrapyd log I get this error:

2014-12-18 14:39:12+0100 [-] Process started:  project='scraper' spider='spider' job='3f81a0e486bb11e49a6800163ed5ae93' pid=28565 log='/usr/scrapyd/logs/scraper/spider/3f81a0e486bb11e49a6800163ed5ae93.log' items='/usr/scrapyd/items/scraper/spider/3f81a0e486bb11e49a6800163ed5ae93.jl'
2014-12-18 14:39:13+0100 [Launcher,28565/stderr] Traceback (most recent call last):
      File "/usr/lib/python2.7/runpy.py", line 162, in _run_module_as_main
        "__main__", fname, loader, pkg_name)
      File "/usr/lib/python2.7/runpy.py", line 72, in _run_code
        exec code in run_globals
      File "/usr/local/lib/python2.7/dist-packages/scrapyd/runner.py", line 39, in <module>
2014-12-18 14:39:13+0100 [Launcher,28565/stderr]     main()
      File "/usr/local/lib/python2.7/dist-packages/scrapyd/runner.py", line 36, in main
        execute()
      File "/usr/local/lib/python2.7/dist-packages/scrapy/cmdline.py", line 143, in execute
        _run_print_help(parser, _run_command, cmd, args, opts)
      File "/usr/local/lib/python2.7/dist-packages/scrapy/cmdline.py", line 89, in _run_print_help
        func(*a, **kw)
      File "/usr/local/lib/python2.7/dist-packages/scrapy/cmdline.py", line 150, in _run_command
        cmd.run(args, opts)
      File "/usr/local/lib/python2.7/dist-packages/scrapy/commands/crawl.py", line 58, in run
        spider = crawler.spiders.create(spname, **opts.spargs)
2014-12-18 14:39:13+0100 [Launcher,28565/stderr]   File "/usr/local/lib/python2.7/dist-packages/scrapy/spidermanager.py", line 48, in create
        return spcls(**spider_kwargs)
      File "build/bdist.linux-x86_64/egg/scraper/spiders/spider.py", line 104, in __init__
      File "/usr/lib/python2.7/os.py", line 157, in makedirs
        mkdir(name, mode)
    OSError: [Errno 20] Not a directory: '/tmp/scraper-1418909944-dKTRZI.egg/logs/'
2014-12-18 14:39:14+0100 [-] Process died: exitstatus=1  project='scraper' 

Any ideas? :(

【Comments】:

  • What does this mean: OSError: [Errno 20] Not a directory: '/tmp/scraper-1418909944-dKTRZI.egg/logs/'?
  • Hmm, it seems it can't access that location. Could it be a permissions problem?
  • drwxrwxrwx 3 root root 4096 Dec 18 16:16 tmp
  • It seems it can't create the logs directory at that location, or something? Why?
  • The traceback points to line 104 of scraper/spiders/spider.py; what is the code on that line?

标签: scrapy scrapyd


【Solution 1】:

You are trying to create a directory inside the egg. The error message points right at it:

OSError: [Errno 20] Not a directory: '/tmp/scraper-1418909944-dKTRZI ---->.egg<----- /logs/'
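A sketch of what is likely happening (we can't see line 104 of spider.py, so the buggy pattern shown here is an assumption): scrapyd runs the project straight out of the packaged .egg, which is a regular file, not a directory. Any path built relative to the egg, e.g. via `__file__`, therefore has a plain file as a path component, and `os.makedirs` fails with ENOTDIR. The fix is to create the logs directory under an absolute, writable location outside the egg:

```python
import errno
import os
import tempfile

# Reproduce the failure: a path component ("app.egg") is a regular
# file, so makedirs raises OSError [Errno 20] Not a directory,
# exactly as in the scrapyd log above.
base = tempfile.mkdtemp()
egg = os.path.join(base, "app.egg")
open(egg, "w").close()  # the packaged egg is just a file on disk

try:
    os.makedirs(os.path.join(egg, "logs"))
except OSError as e:
    assert e.errno == errno.ENOTDIR

# Sketch of a fix: build the logs path from an absolute, writable
# base directory instead of a path relative to the egg/__file__.
log_dir = os.path.join(base, "logs")
if not os.path.isdir(log_dir):
    os.makedirs(log_dir)
```

Note that scrapyd already writes per-job logs (the `log=` path in the output above), so the spider may not need its own logs directory at all.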

【Discussion】:
