【Posted】: 2021-08-31 13:36:23
【Problem Description】:
I am trying to use Apache Zeppelin, but it fails with the following error:
org.apache.zeppelin.interpreter.InterpreterException: org.apache.zeppelin.interpreter.InterpreterException: Fail to open SparkInterpreter
at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:76)
at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:836)
at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:744)
at org.apache.zeppelin.scheduler.Job.run(Job.java:172)
at org.apache.zeppelin.scheduler.AbstractScheduler.runJob(AbstractScheduler.java:132)
at org.apache.zeppelin.scheduler.FIFOScheduler.lambda$runJobInScheduler$0(FIFOScheduler.java:42)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.zeppelin.interpreter.InterpreterException: Fail to open SparkInterpreter
at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:122)
at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
... 8 more
Caused by: java.lang.Exception: This is not officially supported spark version: 3.1.1
You can set zeppelin.spark.enableSupportedVersionCheck to false if you really want to try this version of spark.
at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:112)
I am not sure how to set zeppelin.spark.enableSupportedVersionCheck to false. I am also not sure whether I should, since the Zeppelin documentation explicitly states "Do not change - developer only setting, not for production use".
However, short of uninstalling my current version of Apache Spark and replacing it with an older one (which I would really rather not do), I don't see how to make the two compatible.
Any advice is appreciated. Thanks for your time!
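For reference, interpreter properties like this one are normally set through the Zeppelin web UI: open the Interpreter menu, find the spark interpreter, click edit, and add the property in the Properties table before restarting the interpreter. A minimal sketch of the entry (the exact UI layout may vary by Zeppelin version):

```properties
# Zeppelin web UI → Interpreter → spark → edit → Properties
# (add as a new name/value row, then save and restart the interpreter)
zeppelin.spark.enableSupportedVersionCheck = false
```

Note that this only disables the version check; whether Spark 3.1.1 actually works with a given Zeppelin release is a separate question, which is presumably why the documentation flags the setting as developer-only.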
【Discussion】:
Tags: apache-spark apache-zeppelin