【Question Title】: Fail to run DAG on Airflow 1.10.14 running with docker-compose on the official apache/airflow image
【Posted】: 2020-12-24 21:09:06
【Description】:

I have been trying to set up Airflow 1.10.14 to execute a Python-based process in Docker containers using docker-compose. The host is an Ubuntu 18 virtual machine.

My Dockerfile:

FROM apache/airflow:1.10.14
USER root
RUN pip install --upgrade pip
RUN pip install --user psycopg2-binary

COPY airflow.cfg /opt/airflow/

RUN apt-get update && apt-get install -y \
    libodbc1 \
    python3-dev \
    libevent-dev \
    unixodbc-dev \
    freetds-dev \
    freetds-bin \
    tdsodbc \
    build-essential

# install dependencies
ADD requirements.txt .
RUN pip install -r requirements.txt


USER airflow

Then I run:

docker build -t learning/airflow .

My docker-compose.yml is:

version: "3"
networks:
  airflow:

services:
  postgres:
    image: "postgres:9.6"
    container_name: "postgres"
    environment:
      - POSTGRES_USER=airflow
      - POSTGRES_PASSWORD=airflow
      - POSTGRES_DB=airflow
    ports:
      - "5432:5432"
    networks:
      - airflow

  # uncomment initdb if you need initdb at first run
  initdb:
    image: learning/airflow
    entrypoint: airflow initdb
    depends_on:
      - postgres
    networks:
      - airflow

  webserver:
    image: learning/airflow
    restart: always
    depends_on:
      - postgres
    volumes:
      - ./dags:/opt/airflow/dags
    ports:
      - "8080:8080"
    entrypoint: airflow webserver
    healthcheck:
      test: ["CMD-SHELL", "[ -f /opt/airflow/airflow-webserver.pid ]"]
      interval: 30s
      timeout: 30s
      retries: 3
    networks:
      - airflow

  scheduler:
    image: learning/airflow
    restart: always
    depends_on:
      - postgres
      - webserver
    volumes:
      - ./dags:/opt/airflow/dags
      - ./logs:/opt/airflow/logs
    entrypoint: airflow scheduler
    healthcheck:
      test: ["CMD-SHELL", "[ -f /opt/airflow/airflow-scheduler.pid ]"]
      interval: 30s
      timeout: 30s
      retries: 3
    networks:
      - airflow
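
For reference, the `sql_alchemy_conn` value that `airflow.cfg` needs for the `postgres` service above can be assembled like this (a sketch; the hostname `postgres` comes from the compose service name, and the `psycopg2-binary` package installed in the Dockerfile provides the driver):

```python
# Assemble the SQLAlchemy URI Airflow expects in airflow.cfg (sql_alchemy_conn),
# using the values from the postgres service in docker-compose.yml.
user = password = db = "airflow"   # POSTGRES_USER / POSTGRES_PASSWORD / POSTGRES_DB
host, port = "postgres", 5432      # the service name doubles as the hostname on the compose network

uri = f"postgresql+psycopg2://{user}:{password}@{host}:{port}/{db}"
print(uri)  # postgresql+psycopg2://airflow:airflow@postgres:5432/airflow
```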

I am also using the airflow.cfg that appears here (with minor changes). On the first run, I execute 3 steps in separate terminals:

docker-compose up postgres
docker-compose up initdb
docker-compose up webserver scheduler

I can access the Airflow UI and switch the DAG on, but the first step fails immediately with the following error:

*** Log file does not exist: /opt/airflow/logs/stg_process/Process_g/2020-12-23T00:00:00+00:00/2.log
*** Fetching from: http://bf23abdeb4b0:8793/log/stg_process/Process_g/2020-12-23T00:00:00+00:00/2.log
*** Failed to fetch log file from worker. HTTPConnectionPool(host='bf23abdeb4b0', port=8793): Max retries
exceeded with url:
/log/stg_process/Process_g/2020-12-23T00:00:00+00:00/2.log
(Caused by NewConnectionError('<urllib3.connection.HTTPConnection
object at 0x7ff6f20ae898>: Failed to establish a new connection:
[Errno 111] Connection refused',))
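
One detail that may matter here: in the docker-compose.yml above the webserver mounts only ./dags, while the scheduler also mounts ./logs. The "Failed to fetch log file from worker" message appears when the webserver cannot find the log file on its own disk and falls back to fetching it over HTTP from a container hostname it cannot reach. A sketch of the webserver service with the same logs mount added (assuming tasks run inside the scheduler container, as with the LocalExecutor):

```yaml
  webserver:
    image: learning/airflow
    restart: always
    depends_on:
      - postgres
    volumes:
      - ./dags:/opt/airflow/dags
      - ./logs:/opt/airflow/logs   # same mount as the scheduler, so logs can be read locally
    ports:
      - "8080:8080"
    entrypoint: airflow webserver
```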

What am I missing here? Any help would be appreciated...

【Discussion】:

Tags: docker docker-compose airflow


【Solution 1】:

I may be late, but I think the problem is this line in the Dockerfile:

COPY airflow.cfg /opt/airflow/

It should instead be:

COPY airflow.cfg /opt/airflow/airflow.cfg


【Discussion】:

【Solution 2】:

Have you tried resetting the metadata database and then rebuilding?

airflow resetdb
      
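A note on CLI syntax, since both spellings appear around this image: on Airflow 1.10.x (the apache/airflow:1.10.14 image used in the question) the metadata-database commands are single words, while the `db` subcommand group only exists from Airflow 2.0 onward:

```shell
# Airflow 1.10.x (the image used in the question)
airflow initdb    # create the metadata tables
airflow resetdb   # drop and recreate them

# Airflow 2.x equivalents, for reference
airflow db init
airflow db reset
```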

【Discussion】:
