[Question Title]: Docker-compose connection refused from celery
[Posted]: 2017-04-12 06:10:47
[Question]:

I'm running docker-compose to bring up Django, Celery, Postgres and RabbitMQ together, with the following docker-compose.yml:

version: '2'

services:
  # PostgreSQL database
  db:
    image: postgres:9.4
    hostname: db
    environment:
      - POSTGRES_USER=<XXX>
      - POSTGRES_PASSWORD=<XXX>
      - POSTGRES_DB=<XXX>
    ports:
      - "5431:5432"

  rabbit:
    hostname: rabbit
    image: rabbitmq:3-management
    environment:
      - RABBITMQ_DEFAULT_USER=<XXX>
      - RABBITMQ_DEFAULT_PASS=<XXX>
    ports:
      - "5672:5672" 
      - "15672:15672"

  # Django web server
  web:
    build:
      context: .
      dockerfile: Dockerfile
    hostname: web
    command: /srv/www/run_web.sh
    volumes:
      - .:/srv/www
    ports:
      - "8000:8000"
    links:
      - db
      - rabbit
    depends_on:
      - db

  # Celery worker
  worker:
    hostname: celery
    build:
      context: .
      dockerfile: Dockerfile
    command: /srv/www/run_celery.sh
    volumes:
      - .:/srv/www
    links:
      - db
      - rabbit
    depends_on:
      - rabbit

In one of the Django views I delegate a Celery task to do some processing and then try to POST the result to another web service:

#views.py
@csrf_exempt
def process_data(request):
    if request.method == 'POST':

        #
        #Processing to retrieve data here
        #

        delegate_celery_task.delay(data)
    return HttpResponse(status=200)

#tasks.py
@app.task
def delegate_celery_task(in_data):
    from extractorService.settings import MASTER_NODE
    import json
    import urllib.request

    #
    #Some processing on in_data here to give out_data
    # 

    data = {'data': out_data}
    params = json.dumps(data).encode('utf8')

    req = urllib.request.Request('http://%s/api/data/'%(MASTER_NODE), data=params,
              headers={'content-type': 'application/json'})

    urllib.request.urlopen(req)
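For reference, here is a minimal sketch of the same POST with an explicit timeout and error logging, so a refused connection shows up clearly in the worker log (the /api/data/ path comes from the code above; the post_result name and the timeout value are my own assumptions):

import json
import logging
import urllib.error
import urllib.request

logger = logging.getLogger(__name__)

def post_result(master_node, out_data, timeout=10):  # timeout is an assumed value
    params = json.dumps({'data': out_data}).encode('utf8')
    req = urllib.request.Request(
        'http://%s/api/data/' % master_node,
        data=params,
        headers={'content-type': 'application/json'},
    )
    try:
        return urllib.request.urlopen(req, timeout=timeout)
    except urllib.error.URLError as exc:
        # "Connection refused" here usually means master_node is not
        # reachable from inside the worker container.
        logger.error('POST to %s failed: %s', master_node, exc)
        raise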

Now MASTER_NODE is just localhost:8001, where I'm running the other web service. The whole setup works when I run everything outside Docker. Under Docker, though, the worker prints this on startup:

worker_1 | [2016-11-28 12:20:17,527: WARNING/PoolWorker-2] unable to cache TLDs in file /usr/local/lib/python3.5/site-packages/tldextract/.tld_set: [Errno 13] Permission denied: '/usr/local/lib/python3.5/site-packages/tldextract/.tld_set'

and then, on POSTing to the Django view, the Celery worker starts the task but fails at the urlopen call:

worker_1 | Traceback (most recent call last):
worker_1 |   File "/usr/local/lib/python3.5/site-packages/celery/app/trace.py", line 368, in trace_task
worker_1 |     R = retval = fun(*args, **kwargs)
worker_1 |   File "/usr/local/lib/python3.5/site-packages/celery/app/trace.py", line 623, in __protected_call__
worker_1 |     return self.run(*args, **kwargs)
worker_1 |   File "/srv/extractor_django/extractorService/tasks.py", line 25, in extract_entities
worker_1 |     urllib.request.urlopen(req)
worker_1 |   File "/usr/local/lib/python3.5/urllib/request.py", line 162, in urlopen
worker_1 |     return opener.open(url, data, timeout)
worker_1 |   File "/usr/local/lib/python3.5/urllib/request.py", line 465, in open
worker_1 |     response = self._open(req, data)
worker_1 |   File "/usr/local/lib/python3.5/urllib/request.py", line 483, in _open
worker_1 |     '_open', req)
worker_1 |   File "/usr/local/lib/python3.5/urllib/request.py", line 443, in _call_chain
worker_1 |     result = func(*args)
worker_1 |   File "/usr/local/lib/python3.5/urllib/request.py", line 1268, in http_open
worker_1 |     return self.do_open(http.client.HTTPConnection, req)
worker_1 |   File "/usr/local/lib/python3.5/urllib/request.py", line 1242, in do_open
worker_1 |     raise URLError(err)
worker_1 | urllib.error.URLError:
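Note that inside a container, localhost refers to the container itself rather than the Docker host, so a request to http://localhost:8001 from the worker container is refused. A minimal sketch of one workaround, assuming MASTER_NODE is made configurable and pointed at the host's bridge address when running under compose (172.17.0.1 is a common default-bridge gateway, but that value is an assumption to verify on your machine):

# settings.py -- read the master node address from the environment;
# the fallback below only makes sense when running outside Docker.
import os

MASTER_NODE = os.environ.get('MASTER_NODE', 'localhost:8001')

with a corresponding environment entry such as MASTER_NODE=172.17.0.1:8001 on the worker service in docker-compose.yml.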

The Celery configuration in settings.py is:

import os

from kombu import Exchange, Queue

RABBIT_HOSTNAME = os.environ.get('RABBIT_PORT_5672_TCP', 'rabbit')
if RABBIT_HOSTNAME.startswith('tcp://'):
    RABBIT_HOSTNAME = RABBIT_HOSTNAME.split('//')[1]

BROKER_URL = os.environ.get('BROKER_URL', '')
if not BROKER_URL:
    BROKER_URL = 'amqp://{user}:{password}@{hostname}'.format(
        user=os.environ.get('RABBIT_ENV_USER', '<XXX>'),
        password=os.environ.get('RABBIT_ENV_RABBITMQ_PASS', '<XXX>'),
        hostname=RABBIT_HOSTNAME)

BROKER_HEARTBEAT = '?heartbeat=30'
if not BROKER_URL.endswith(BROKER_HEARTBEAT):
    BROKER_URL += BROKER_HEARTBEAT

BROKER_POOL_LIMIT = 1
BROKER_CONNECTION_TIMEOUT = 10

CELERY_DEFAULT_QUEUE = 'default'
CELERY_QUEUES = (
    Queue('default', Exchange('default'), routing_key='default'),
)

CELERY_ALWAYS_EAGER = False
CELERY_ACKS_LATE = True
CELERY_TASK_PUBLISH_RETRY = True
CELERY_DISABLE_RATE_LIMITS = False

CELERY_IGNORE_RESULT = True
CELERY_SEND_TASK_ERROR_EMAILS = False
CELERY_TASK_RESULT_EXPIRES = 600

CELERYD_HIJACK_ROOT_LOGGER = False
CELERYD_PREFETCH_MULTIPLIER = 1
CELERYD_MAX_TASKS_PER_CHILD = 1000
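As a quick sanity check that the BROKER_URL assembled above is actually reachable from inside the worker container, here is a minimal sketch using kombu (which ships with Celery; the credentials are placeholders):

from kombu import Connection

# BROKER_URL as built in settings.py, e.g. 'amqp://user:pass@rabbit?heartbeat=30'
with Connection('amqp://<XXX>:<XXX>@rabbit') as conn:
    conn.connect()  # raises if the rabbit broker is unreachable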

Does anyone have any ideas on how to solve this?

[Comments]:

  • Where is your Celery task trying to POST to? It should be making the request against your rabbit container, not localhost.
  • It's trying to POST to an external site running outside Docker. In development that's localhost on my machine, where another web service runs on port 8001, but in production it's a site running on AWS.
  • I'm more interested in what the backend configuration is for the Celery task you're creating.
  • RabbitMQ is used as the queue, and I don't store results; I just POST them to another service. I've added the Celery configuration from settings.py.
  • I think there should be an IP you can use, usually 172.x.x.x, e.g. 172.17.42.1 (see the sketch below); also make sure the service listens for connections coming from the bridge device.
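Regarding the bridge IP mentioned in the last comment, here is a minimal sketch for discovering the container's default gateway (the docker0 bridge on the host) from inside the container; it assumes a Linux container with /proc mounted:

import socket
import struct

def default_gateway():
    """Return the default-route gateway IP as seen from inside the container."""
    with open('/proc/net/route') as fh:
        for line in fh.readlines()[1:]:
            fields = line.split()
            if fields[1] == '00000000':  # destination 0.0.0.0 == default route
                # the gateway field is a little-endian, hex-encoded IPv4 address
                return socket.inet_ntoa(struct.pack('<L', int(fields[2], 16)))

print(default_gateway())  # e.g. 172.17.0.1 on the default bridge network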

Tags: django docker celery docker-compose


[Solution 1]:

You didn't mention your Celery version, but judging by the post date I'd guess it's v4.

I just ran into a similar problem after upgrading Celery from v3.1 to v4: according to this tutorial, BROKER_URL in settings.py needs to be renamed to CELERY_BROKER_URL.
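For reference, a minimal sketch of what that rename looks like, assuming the usual Django integration pattern where the Celery app is created with app.config_from_object('django.conf:settings', namespace='CELERY'):

# settings.py -- Celery v4 style: with namespace='CELERY', the broker
# setting is read from CELERY_BROKER_URL; credentials are placeholders.
CELERY_BROKER_URL = 'amqp://<XXX>:<XXX>@rabbit?heartbeat=30'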

[Discussion]:
