【Question Title】: Spark-operator on EKS: Apache Spark failed to create temp directory
【Posted】: 2021-10-06 09:55:03
【Question Description】:

I am trying to deploy the simple spark-pi.yaml to AWS EKS using the spark-operator. I have already deployed the spark-operator successfully.

The deployment YAML I am referencing is here: spark-operator example

When I run helm install, I get the following error:

Events:
  Type     Reason                            Age   From            Message
  ----     ------                            ----  ----            -------
  Normal   SparkApplicationAdded             8s    spark-operator  SparkApplication spark-pi was added, enqueuing it for submission
  Warning  SparkApplicationSubmissionFailed  5s    spark-operator  failed to submit SparkApplication spark-pi: failed to run spark-submit for SparkApplication spark-operator/spark-pi: WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/opt/spark/jars/spark-unsafe_2.12-3.1.1.jar) to constructor java.nio.DirectByteBuffer(long,int)
WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Exception in thread "main" java.io.IOException: Failed to create a temp directory (under /tmp) after 10 attempts!
  at org.apache.spark.util.Utils$.createDirectory(Utils.scala:305)
  at org.apache.spark.util.Utils$.createTempDir(Utils.scala:325)
  at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:343)
  at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:894)
  at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
  at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
  at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
  at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1030)
  at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1039)
  at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

How can I fix this issue?

【Question Comments】:

    标签: apache-spark kubernetes kubernetes-helm amazon-eks spark-operator


【Solution 1】:

This will be hard to debug, but in my experience a couple of things could be going on here:

1. I see that your executor does not have a service account defined. You may need to define one explicitly.
2. Your volume may not have enough space to create the /tmp directory. You may want to double-check your volume size.
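Both points above can be addressed in the SparkApplication manifest itself. Below is a minimal sketch, assuming the standard spark-operator v1beta2 CRD fields; the service account name `spark` is an assumption (use whatever account your operator install created, e.g. check `kubectl get serviceaccounts`), and the emptyDir size limit is purely illustrative:

```yaml
apiVersion: "sparkoperator.k8s.io/v1beta2"
kind: SparkApplication
metadata:
  name: spark-pi
spec:
  # (image, mainClass, mainApplicationFile, etc. as in the spark-pi example)
  volumes:
    - name: spark-tmp
      emptyDir:
        sizeLimit: "2Gi"        # illustrative size; make sure the node has room
  driver:
    serviceAccount: spark       # point 1: explicit service account (name is an assumption)
    volumeMounts:
      - name: spark-tmp
        mountPath: /tmp         # point 2: writable /tmp with enough space
  executor:
    serviceAccount: spark
    volumeMounts:
      - name: spark-tmp
        mountPath: /tmp
```

Mounting an emptyDir over /tmp gives the submission a writable scratch directory regardless of the container filesystem; if the error persists, it is worth checking disk pressure on the node itself.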

【Discussion】:
