【Question Title】: DockerOperator failure: Permission denied for docker.sock
【Posted】: 2021-05-12 12:09:27
【Problem Description】:

I run Airflow with Docker Compose, and I want to create a DAG that uses the DockerOperator to run a Docker container.
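
For context, a minimal DAG of the kind described could look like the sketch below (the dag id, image, and command are illustrative, not taken from the post; it assumes the apache-airflow-providers-docker package is installed):

from datetime import datetime

from airflow import DAG
from airflow.providers.docker.operators.docker import DockerOperator

with DAG(
    dag_id="docker_operator_demo",   # hypothetical name
    start_date=datetime(2021, 5, 1),
    schedule_interval=None,          # trigger manually while debugging
    catchup=False,
) as dag:
    run_container = DockerOperator(
        task_id="run_container",                  # hypothetical name
        image="alpine:3.13",                      # any image works for the test
        command="echo hello",
        docker_url="unix://var/run/docker.sock",  # the socket mounted from the host
        auto_remove=True,
    )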

The problem is that I keep getting the same error when I look at the DAG's logs in Airflow:

[2021-05-12 11:37:38,060] {taskinstance.py:1482} ERROR - Task failed with exception
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.6/site-packages/urllib3/connectionpool.py", line 677, in urlopen
    chunked=chunked,
  File "/home/airflow/.local/lib/python3.6/site-packages/urllib3/connectionpool.py", line 392, in _make_request
    conn.request(method, url, **httplib_request_kw)
  File "/usr/local/lib/python3.6/http/client.py", line 1287, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "/usr/local/lib/python3.6/http/client.py", line 1333, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "/usr/local/lib/python3.6/http/client.py", line 1282, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/usr/local/lib/python3.6/http/client.py", line 1042, in _send_output
    self.send(msg)
  File "/usr/local/lib/python3.6/http/client.py", line 980, in send
    self.connect()
  File "/home/airflow/.local/lib/python3.6/site-packages/docker/transport/unixconn.py", line 43, in connect
    sock.connect(self.unix_socket)
PermissionError: [Errno 13] Permission denied

The file in question is docker.sock, which has the following permissions:

srw-rw----. 1 root docker

Note that in the docker-compose yml file I set the user to the default user and the group to root, and I bind-mount docker.sock as follows:

- /var/run/docker.sock:/var/run/docker.sock:z

Here is the docker-compose file:

version: '3'
x-airflow-common:
  &airflow-common
  image: ${AIRFLOW_IMAGE_NAME:-apache/airflow:2.0.2}
  environment:
    &airflow-common-env
    AIRFLOW_UID: 8854
    AIRFLOW_GID: 0
    AIRFLOW__CORE__EXECUTOR: CeleryExecutor
    AIRFLOW__CORE__SQL_ALCHEMY_CONN: postgresql+psycopg2://airflow:airflow@postgres/airflow
    AIRFLOW__CELERY__RESULT_BACKEND: db+postgresql://airflow:airflow@postgres/airflow
    AIRFLOW__CELERY__BROKER_URL: redis://:@redis:6379/0
    AIRFLOW__CORE__FERNET_KEY: ''
    AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION: 'true'
    AIRFLOW__CORE__LOAD_EXAMPLES: 'false'
    AIRFLOW__CORE__DAGS_FOLDER: '/data/python/airflow/dags'
    AIRFLOW__API__AUTH_BACKEND: 'airflow.api.auth.backend.basic_auth'
  volumes:
    - /data/graylog/python/airflow/dags:/data/python/airflow/dags:z
    - /data/graylog/python/airflow/logs:/data/python/airflow/logs:z
    - /data/graylog/python/airflow/plugins:/data/python/airflow/plugins:z
    - /data/graylog/python/timestamp:/data/python/timestamp:z
    - /data/graylog/python/python_scripts:/data/python/python_scripts:z
    - /var/run/docker.sock:/var/run/docker.sock:z
  user: "${AIRFLOW_UID:-8854}:${AIRFLOW_GID:-0}"
  depends_on:
    redis:
      condition: service_healthy
    postgres:
      condition: service_healthy

services:
  postgres:
    image: postgres:13
    environment:
      POSTGRES_USER: airflow
      POSTGRES_PASSWORD: airflow
      POSTGRES_DB: airflow
    volumes:
      - postgres-db-volume:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD", "pg_isready", "-U", "airflow"]
      interval: 5s
      retries: 5
    restart: always

  redis:
    image: redis:latest
    ports:
      - 6379
    healthcheck:
      test: ["CMD", "redis-cli", "ping"]
      interval: 5s
      timeout: 30s
      retries: 50
    restart: always

  airflow-webserver:
    <<: *airflow-common
    command: webserver
    ports:
      - 8080:8080
    healthcheck:
      test: ["CMD", "curl", "--fail", "http://localhost:8080/health"]
      interval: 10s
      timeout: 10s
      retries: 5
    restart: always

  airflow-scheduler:
    <<: *airflow-common
    command: scheduler
    restart: always

  airflow-worker:
    <<: *airflow-common
    command: celery worker
    restart: always

  airflow-init:
    <<: *airflow-common
    command: version
    environment:
      <<: *airflow-common-env
      _AIRFLOW_DB_UPGRADE: 'true'
      _AIRFLOW_WWW_USER_CREATE: 'true'
      _AIRFLOW_WWW_USER_USERNAME: ${_AIRFLOW_WWW_USER_USERNAME:-airflow}
      _AIRFLOW_WWW_USER_PASSWORD: ${_AIRFLOW_WWW_USER_PASSWORD:-airflow}

  flower:
    <<: *airflow-common
    command: celery flower
    ports:
      - 5555:5555
    healthcheck:
      test: ["CMD", "curl", "--fail", "http://localhost:5555/"]
      interval: 10s
      timeout: 10s
      retries: 5
    restart: always

volumes:
  postgres-db-volume:

Can anyone help me solve this? Thanks.

【Question Comments】:

  • Please post your docker-compose file.
  • I have just added it.
  • Did you ever solve this? So far the only solutions I have found are giving all users rw permission on docker.sock, or adding the user that runs the containers to the docker group and to group 0. To my surprise, even adding privileged: true to the celery-worker container does nothing.

Tags: docker docker-compose airflow


【Solution 1】:

To fix this, you can run:

sudo chmod 666 /var/run/docker.sock

After that, the Airflow containers (and every other user on the host) will be able to read and write the Docker socket. I hope this helps you solve the problem.
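
To check that the change took effect, the socket permissions can be inspected both on the host and from inside the worker container (the service name matches the compose file above):

ls -l /var/run/docker.sock                                # should now show srw-rw-rw-
docker-compose exec airflow-worker ls -l /var/run/docker.sock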

【Discussion】:

  • Thanks rassakra for the quick answer. Do you think there is a way to solve this without using sudo chmod, for example by adding the docker group in my docker-compose yml file?
  • Doing that would let every user on the host run commands as root without any password prompt or other check.
  • Try chmod without changing anything in the docker-compose file; alternatively, adding your user to the docker group can also solve the problem.
  • My default user, i.e. the user inside the container, is in the docker group on the host machine. However, inside the container the only group my user belongs to is root. How can I make the default user inside the container a member of the docker group? Note that I have already tried group_add: in my docker-compose yml file without success (see the group_add sketch after this list).
  • sudo groupadd docker, then sudo usermod -aG docker $USER, then service docker restart.
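
A gentler alternative to chmod 666 is to give the container user the host's docker group as a supplementary group via group_add. One pitfall that may explain the failed attempt above: the container's /etc/group has no "docker" entry, so the group name does not resolve and the numeric GID must be used instead (998 below is illustrative; look up the real value with getent group docker). Whether group_add is accepted under the version: '3' file format also depends on the docker-compose release in use, so treat this as a sketch:

x-airflow-common:
  &airflow-common
  # ... image, environment, volumes exactly as in the compose file above ...
  user: "${AIRFLOW_UID:-8854}:0"
  group_add:
    - 998   # numeric GID of the host's docker group (getent group docker)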