【Posted】: 2019-09-25 05:06:57
【Question】:
I am running a papermill command with Airflow (in Docker). The notebook is stored on S3, and I run it through papermill's Python client. It eventually fails with a completely baffling error:
Traceback (most recent call last):
  File "/usr/local/lib/python3.5/dist-packages/ipython_genutils/ipstruct.py", line 132, in __getattr__
    result = self[key]
KeyError: 'kernelspec'
I looked through the documentation, but in vain.
The code I use to run the papermill command is:
import time
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python_operator import PythonOperator

from mypackage.datastore import db
from mypackage.workflow.transform.jupyter_notebook import run_jupyter_notebook

dag_id = "jupyter-test-dag"

default_args = {
    'owner': "aviral",
    'depends_on_past': False,
    'start_date': "2019-02-28T00:00:00",
    'email': "aviral@some_org.com",
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 0,
    'retry_delay': timedelta(minutes=5),
    'provide_context': True
}

dag = DAG(
    dag_id,
    catchup=False,
    default_args=default_args,
    schedule_interval=None,
    max_active_runs=1
)

def print_context(ds, **kwargs):
    print(kwargs)
    print(ds)
    return 'Whatever you return gets printed in the logs'

def run_python_jupyter(**kwargs):
    run_jupyter_notebook(
        script_location=kwargs["script_location"]
    )

create_job_task = PythonOperator(
    task_id="create_job",
    python_callable=run_python_jupyter,
    dag=dag,
    op_kwargs={
        "script_location": "s3://some_bucket/python3_file_write.ipynb"
    }
)

globals()[dag_id] = dag
The function run_jupyter_notebook is:
import papermill as pm

def run_jupyter_notebook(**kwargs):
    """Runs Jupyter notebook"""
    script_location = kwargs.get('script_location', '')
    if not script_location:
        raise ValueError(
            "Script location was not provided."
        )
    pm.execute_notebook(script_location, script_location.split(
        '.ipynb')[0] + "_output" + ".ipynb")
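The KeyError above comes from papermill looking up the `kernelspec` entry in the notebook's metadata and not finding one. `pm.execute_notebook` accepts a `kernel_name` argument that overrides that metadata lookup, so one workaround is to name the kernel explicitly. A minimal sketch, assuming a standard `python3` kernel is installed in the Airflow image (the helper names here are illustrative, not from the original code):

```python
import posixpath


def derive_output_path(script_location: str) -> str:
    """Build the output path next to the input, e.g. x.ipynb -> x_output.ipynb."""
    base, ext = posixpath.splitext(script_location)
    return base + "_output" + ext


def run_notebook(script_location: str, kernel_name: str = "python3") -> None:
    """Execute the notebook with an explicitly named kernel."""
    import papermill as pm  # imported here so the path helper above stays stdlib-only

    pm.execute_notebook(
        script_location,
        derive_output_path(script_location),
        kernel_name=kernel_name,  # skip the kernelspec lookup in the notebook metadata
    )
```

Passing `kernel_name` only helps if the notebook metadata is the problem; if the named kernel itself is missing from the container, papermill raises `NoSuchKernel` instead.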
I expect the code to run without any errors, since it also runs fine locally (using a local filesystem path instead of the S3 path).
【Comments】:
-
I think this is related to the kernel name written into the ipynb file. You may have saved the ipynb file with a different kernel than the one trying to execute it.
-
Any luck? Not exactly the same, but I do hit a similar error when running in Airflow: jupyter_client.kernelspec.NoSuchKernel: No such kernel named python3
Tags: python python-3.x jupyter-notebook jupyter papermill