【Title】: Issues after Apache Airflow migration from 1.9.0 to 1.10.1
【Posted】: 2018-11-22 18:01:10
【Question】:

I have just upgraded my Airflow installation from 1.9.0 to 1.10.1.

I install and run Airflow with Docker, so I simply updated my Dockerfile with these lines:

ENV SLUGIFY_USES_TEXT_UNIDECODE yes

RUN pip install apache-airflow[crypto,celery,postgres,hive,jdbc]==1.10.1

Then I ran docker build, followed by docker-compose with the new image.
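Put together, the relevant Dockerfile fragment might look like this (a minimal sketch; the base image is an assumption, as the question does not state which one is used):

```dockerfile
# Hypothetical base image; adjust to whatever your original Dockerfile uses.
FROM python:3.5-slim

# Required when installing Airflow 1.10.x: accepts the GPL-free slugify dependency.
ENV SLUGIFY_USES_TEXT_UNIDECODE yes

# Upgrade Airflow with the same extras as before, pinned to 1.10.1.
RUN pip install apache-airflow[crypto,celery,postgres,hive,jdbc]==1.10.1
```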

So far, so good.

In airflow.cfg, I added one line:

rbac = True

because I want to create users with specific roles and give them access only to their own DAGs.
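In Airflow 1.10.x this flag belongs under the [webserver] section of airflow.cfg; a minimal fragment (all other settings assumed to keep their defaults):

```ini
[webserver]
# Switch the UI to the Flask-AppBuilder interface with role-based access control.
rbac = True
```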

The Docker containers run without errors, but an error occurs as soon as I click a DAG name in the UI or try to trigger a DAG:

Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/flask/app.py", line 1982, in wsgi_app
    response = self.full_dispatch_request()
  File "/usr/local/lib/python3.5/site-packages/flask/app.py", line 1614, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/usr/local/lib/python3.5/site-packages/flask/app.py", line 1517, in handle_user_exception
    reraise(exc_type, exc_value, tb)
  File "/usr/local/lib/python3.5/site-packages/flask/_compat.py", line 33, in reraise
    raise value
  File "/usr/local/lib/python3.5/site-packages/flask/app.py", line 1612, in full_dispatch_request
    rv = self.dispatch_request()
  File "/usr/local/lib/python3.5/site-packages/flask/app.py", line 1598, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "/usr/local/lib/python3.5/site-packages/flask_admin/base.py", line 69, in inner
    return self._run_view(f, *args, **kwargs)
  File "/usr/local/lib/python3.5/site-packages/flask_admin/base.py", line 368, in _run_view
    return fn(self, *args, **kwargs)
  File "/usr/local/lib/python3.5/site-packages/flask_login/utils.py", line 261, in decorated_view
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.5/site-packages/airflow/www/utils.py", line 372, in view_func
    return f(*args, **kwargs)
  File "/usr/local/lib/python3.5/site-packages/airflow/www/utils.py", line 278, in wrapper
    return f(*args, **kwargs)
  File "/usr/local/lib/python3.5/site-packages/airflow/utils/db.py", line 74, in wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.5/site-packages/airflow/www/views.py", line 1345, in tree
    session, start_date=min_date, end_date=base_date)
  File "/usr/local/lib/python3.5/site-packages/airflow/models.py", line 3753, in get_task_instances
    tis = tis.order_by(TI.execution_date).all()
  File "/usr/local/lib/python3.5/site-packages/sqlalchemy/orm/query.py", line 2703, in all
    return list(self)
  File "/usr/local/lib/python3.5/site-packages/sqlalchemy/orm/query.py", line 2855, in __iter__
    return self._execute_and_instances(context)
  File "/usr/local/lib/python3.5/site-packages/sqlalchemy/orm/query.py", line 2878, in _execute_and_instances
    result = conn.execute(querycontext.statement, self._params)
  File "/usr/local/lib/python3.5/site-packages/sqlalchemy/engine/base.py", line 945, in execute
    return meth(self, multiparams, params)
  File "/usr/local/lib/python3.5/site-packages/sqlalchemy/sql/elements.py", line 263, in _execute_on_connection
    return connection._execute_clauseelement(self, multiparams, params)
  File "/usr/local/lib/python3.5/site-packages/sqlalchemy/engine/base.py", line 1053, in _execute_clauseelement
    compiled_sql, distilled_params
  File "/usr/local/lib/python3.5/site-packages/sqlalchemy/engine/base.py", line 1189, in _execute_context
    context)
  File "/usr/local/lib/python3.5/site-packages/sqlalchemy/engine/base.py", line 1402, in _handle_dbapi_exception
    exc_info
  File "/usr/local/lib/python3.5/site-packages/sqlalchemy/util/compat.py", line 203, in raise_from_cause
    reraise(type(exception), exception, tb=exc_tb, cause=cause)
  File "/usr/local/lib/python3.5/site-packages/sqlalchemy/util/compat.py", line 186, in reraise
    raise value.with_traceback(tb)
  File "/usr/local/lib/python3.5/site-packages/sqlalchemy/engine/base.py", line 1182, in _execute_context
    context)
  File "/usr/local/lib/python3.5/site-packages/sqlalchemy/engine/default.py", line 470, in do_execute
    cursor.execute(statement, parameters)
sqlalchemy.exc.ProgrammingError: (psycopg2.ProgrammingError) column task_instance.executor_config does not exist
LINE 1: ...ued_dttm, task_instance.pid AS task_instance_pid, task_insta...

Thanks for your help.

【Comments】:

  • Came here after upgrading from v1.10.1 to v1.10.3; the accepted answer worked.

Tags: python docker airflow


【Solution 1】:

Try airflow upgradedb. This will create the missing columns in your metadata database.

For anyone hitting the same issue when upgrading to Airflow >= 2.0.0, try airflow db upgrade instead.
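In a Dockerized setup like the one described, the migration can be run inside the running container; a sketch, assuming a container named airflow-webserver (the actual name depends on your docker-compose file):

```sh
# Airflow 1.10.x: apply pending schema migrations to the metadata DB.
docker exec -it airflow-webserver airflow upgradedb

# Airflow >= 2.0: the same operation under the renamed CLI.
docker exec -it airflow-webserver airflow db upgrade
```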

【Discussion】:

  • Thanks for your answer. But when I run this command I get another error: sqlalchemy.exc.IntegrityError: (psycopg2.IntegrityError) could not create unique index "job_pkey" DETAIL: Key (id)=(128) is duplicated. [SQL: 'ALTER TABLE job ALTER COLUMN start_date TYPE TIMESTAMP WITH TIME ZONE']
  • In the call stack I can see this new error comes from the file "/usr/local/lib/python3.5/site-packages/airflow/migrations/versions/0e2a74e0fc9f_add_time_zone_awareness.py", line 110, in upgrade: op.alter_column(table_name='job', column_name='start_date', type_=sa.TIMESTAMP(timezone=True))
  • When I query the Airflow database with 'select * from job where id = 128', I get only one row...
  • I finally deleted this row (table job, id = 128) from the database, and afterwards everything worked. Strange.
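Before deleting rows, it can help to ask Postgres which keys actually collide, since the index rebuild sees duplicates that a single-id lookup may not reveal. A hedged check, run with psql against the Airflow metadata database (connection details are assumptions):

```sql
-- List any id values that appear more than once in the job table.
SELECT id, COUNT(*) AS cnt
FROM job
GROUP BY id
HAVING COUNT(*) > 1;
```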