【Title】: RUN pip install -r requirements.txt does not install requirements in docker container
【Posted】: 2019-12-23 16:10:54
【Description】:

I'm new to Django, Docker, and Scrapy, and I'm trying to run a Django app that also uses Scrapy (I basically created a Django app that is also a Scrapy project, and I try to call the spider from a Django view). Even though Scrapy is listed in requirements.txt and pip is run from the Dockerfile, the dependencies are not installed in the container before python manage.py runserver 0.0.0.0:8000 runs, so the Django app fails during its system checks and the web container exits with the following exception:

web_1  | Exception in thread django-main-thread:
web_1  | Traceback (most recent call last):
web_1  |   File "/usr/local/lib/python3.7/threading.py", line 926, in _bootstrap_inner
web_1  |     self.run()
web_1  |   File "/usr/local/lib/python3.7/threading.py", line 870, in run
web_1  |     self._target(*self._args, **self._kwargs)
web_1  |   File "/usr/local/lib/python3.7/site-packages/django/utils/autoreload.py", line 54, in wrapper
web_1  |     fn(*args, **kwargs)
web_1  |   File "/usr/local/lib/python3.7/site-packages/django/core/management/commands/runserver.py", line 117, in inner_run
web_1  |     self.check(display_num_errors=True)
web_1  |   File "/usr/local/lib/python3.7/site-packages/django/core/management/base.py", line 390, in check
web_1  |     include_deployment_checks=include_deployment_checks,
web_1  |   File "/usr/local/lib/python3.7/site-packages/django/core/management/base.py", line 377, in _run_checks
web_1  |     return checks.run_checks(**kwargs)
web_1  |   File "/usr/local/lib/python3.7/site-packages/django/core/checks/registry.py", line 72, in run_checks
web_1  |     new_errors = check(app_configs=app_configs)
web_1  |   File "/usr/local/lib/python3.7/site-packages/django/core/checks/urls.py", line 40, in check_url_namespaces_unique
web_1  |     all_namespaces = _load_all_namespaces(resolver)
web_1  |   File "/usr/local/lib/python3.7/site-packages/django/core/checks/urls.py", line 57, in _load_all_namespaces
web_1  |     url_patterns = getattr(resolver, 'url_patterns', [])
web_1  |   File "/usr/local/lib/python3.7/site-packages/django/utils/functional.py", line 80, in __get__
web_1  |     res = instance.__dict__[self.name] = self.func(instance)
web_1  |   File "/usr/local/lib/python3.7/site-packages/django/urls/resolvers.py", line 579, in url_patterns
web_1  |     patterns = getattr(self.urlconf_module, "urlpatterns", self.urlconf_module)
web_1  |   File "/usr/local/lib/python3.7/site-packages/django/utils/functional.py", line 80, in __get__
web_1  |     res = instance.__dict__[self.name] = self.func(instance)
web_1  |   File "/usr/local/lib/python3.7/site-packages/django/urls/resolvers.py", line 572, in urlconf_module
web_1  |     return import_module(self.urlconf_name)
web_1  |   File "/usr/local/lib/python3.7/importlib/__init__.py", line 127, in import_module
web_1  |     return _bootstrap._gcd_import(name[level:], package, level)
web_1  |   File "<frozen importlib._bootstrap>", line 1006, in _gcd_import
web_1  |   File "<frozen importlib._bootstrap>", line 983, in _find_and_load
web_1  |   File "<frozen importlib._bootstrap>", line 967, in _find_and_load_unlocked
web_1  |   File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
web_1  |   File "<frozen importlib._bootstrap_external>", line 728, in exec_module
web_1  |   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
web_1  |   File "/code/composeexample/urls.py", line 21, in <module>
web_1  |     path('scrapy/', include('scrapy_app.urls')),
web_1  |   File "/usr/local/lib/python3.7/site-packages/django/urls/conf.py", line 34, in include
web_1  |     urlconf_module = import_module(urlconf_module)
web_1  |   File "/usr/local/lib/python3.7/importlib/__init__.py", line 127, in import_module
web_1  |     return _bootstrap._gcd_import(name[level:], package, level)
web_1  |   File "<frozen importlib._bootstrap>", line 1006, in _gcd_import
web_1  |   File "<frozen importlib._bootstrap>", line 983, in _find_and_load
web_1  |   File "<frozen importlib._bootstrap>", line 967, in _find_and_load_unlocked
web_1  |   File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
web_1  |   File "<frozen importlib._bootstrap_external>", line 728, in exec_module
web_1  |   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
web_1  |   File "/code/scrapy_app/urls.py", line 4, in <module>
web_1  |     from scrapy_app import views
web_1  |   File "/code/scrapy_app/views.py", line 1, in <module>
web_1  |     from scrapy.crawler import CrawlerProcess
web_1  | ModuleNotFoundError: No module named 'scrapy'

I tried using pip3 instead of pip, running pip install --no-cache-dir -r requirements.txt, and changing the order of the statements in the Dockerfile; I also checked that Scrapy==1.7.3 appears in requirements.txt. Nothing seems to work.

Here is my Dockerfile:

FROM python:3
ENV PYTHONUNBUFFERED 1
RUN mkdir /code
WORKDIR /code
COPY requirements.txt /code/
RUN pip install -r requirements.txt
COPY . /code/

Here is my docker-compose.yml:

version: '3'

services:
  db:
    image: postgres
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code
    ports:
      - "8000:8000"
    depends_on:
      - db

【Question comments】:

  • Please show your requirements.txt
  • ``` (...) chardet==3.0.4 constantly==15.1.0 cryptography==2.7 cssselect==1.1.0 Django==2.2.4 hyperlink==19.0.0 idna==2.8 incremental==17.5.0 lxml==4.4.1 parsel==1.5.2 psycopg2==2.8.3 pyasn1==0.4.6 pyasn1-modules==0.2.6 pycparser==2.19 PyDispatcher==2.0.5 PyHamcrest==1.9.0 pyOpenSSL==19.0.0 python-scrapyd-api==2.1.2 pytz==2019.2 queuelib==1.5.0 requests==2.22.0 Scrapy==1.7.3 scrapy-djangoitem==1.1.1 scrapyd==1.2.1 service-identity==18.1.0 six==1.12.0 sqlparse==0.3.0 Twisted==19.7.0 urllib3==1.25.3 w3lib==1.21.0 zope.interface==4.6.0 ```
  • Could you please show the code?
  • Which command do you use to start the container? Are you sure it is running the latest image, and that the requirements file inside it contains Scrapy?

Tags: django docker pip docker-compose dockerfile


【Solution 1】:

It seems that scrapy is missing from your requirements.txt.

I tried to build a minimal version with all of your components. I hope it helps.

test.py

import scrapy
from time import sleep


def main():
    while True:
        print(scrapy)
        sleep(1)


if __name__ == "__main__":
    main()

requirements.txt

Scrapy==1.7.3

Dockerfile

FROM python:3

ENV PYTHONUNBUFFERED 1

WORKDIR /code

COPY requirements.txt .
RUN pip3 install -r requirements.txt

COPY . ./

CMD [ "python3", "test.py" ]

docker-compose.yml

version: '3'

services:
  db:
    image: postgres
  web:
    build: .
    ports:
      - "8000:8000"
    depends_on:
      - db
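
To verify that the dependency really landed in the built image, one way (a quick check, assuming the service is named web as in the compose file above) is to rebuild and inspect it in a throwaway container:

```shell
# Rebuild the image for the "web" service, ignoring stale containers.
docker-compose build web

# Run a one-off container from that image; "pip show" prints package
# metadata if Scrapy is installed, and nothing otherwise.
docker-compose run --rm web pip show Scrapy
```

If pip show prints nothing here, the container you saw failing is not running the image this Dockerfile builds.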

【Discussion】:

  • I also refactored the docker part a bit ^^
  • I do have Scrapy :( I checked
  • @Iuliana Voinea how did you check? Could you double-check by entering the docker container via docker exec -it web_1 /bin/bash and then running pip show to verify the Scrapy installation?
  • Yes, I did that, and indeed it is not installed in the container, but that's not my question; my question is why the pip install -r requirements.txt specified in the Dockerfile is not executed. I also checked the contents of requirements.txt in the container and they are correct (they contain Scrapy).
【Solution 2】:

A bit late, but I ran into this problem and eventually figured it out (I'm leaving this here for anyone with the same issue).

The first time I tried to build my docker image, my requirements.txt did not contain the required modules. Of course I then added the required modules, but nothing seemed to happen; that is because the container needs to be rebuilt from scratch, otherwise we just keep running the same image again and again.

To rebuild the container with our updated files, we run:

docker-compose rm -f
docker-compose pull
docker-compose up

If that doesn't work, try the same thing but replace the last line with docker-compose up --build -d.

I got this answer from this.
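
As a sketch of the same idea, a sequence that forces Docker to rebuild the image while ignoring cached layers (using the standard --no-cache build option) would be:

```shell
# Stop and remove the old containers for this project.
docker-compose down

# Rebuild the image from scratch, skipping the layer cache so the
# updated requirements.txt is actually re-installed.
docker-compose build --no-cache

# Start the stack again in the background from the fresh image.
docker-compose up -d
```

The key point is the same as above: changing requirements.txt on the host does nothing until an image rebuild re-runs the pip install layer.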

【Discussion】:
