【Question Title】: Spark failed to delete temp directory
【Posted】: 2023-03-16 15:27:01
【Question Description】:

I am trying to submit a Spark program from cmd on Windows 10 with the following command:

spark-submit --class abc.Main --master local[2] C:\Users\arpitbh\Desktop\AmdocsIDE\workspace\Line_Count_Spark\target\Line_Count_Spark-0.0.1-SNAPSHOT.jar

But when it runs I get the following error:

17/05/02 11:56:57 INFO ShutdownHookManager: Deleting directory C:\Users\arpitbh\AppData\Local\Temp\spark-03f14dbe-1802-40ca-906c-af8de0f462f9
17/05/02 11:56:57 ERROR ShutdownHookManager: Exception while deleting Spark temp dir: C:\Users\arpitbh\AppData\Local\Temp\spark-03f14dbe-1802-40ca-906c-af8de0f462f9
java.io.IOException: Failed to delete: C:\Users\arpitbh\AppData\Local\Temp\spark-03f14dbe-1802-40ca-906c-af8de0f462f9
        at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:1010)
        at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:65)
        at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:62)
        at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
        at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
        at org.apache.spark.util.ShutdownHookManager$$anonfun$1.apply$mcV$sp(ShutdownHookManager.scala:62)
        at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:216)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:188)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
        at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1951)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:188)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
        at scala.util.Try$.apply(Try.scala:192)
        at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
        at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
        at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)

I also checked the Apache Spark JIRA; this defect is marked as resolved there, but no fix is described. Please help.

package abc;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;


public class Main {

    /**
     * @param args
     */
    public static void main(String[] args) {
        // TODO Auto-generated method stub

        SparkConf conf = new SparkConf().setAppName("Line_Count").setMaster("local[2]");
        JavaSparkContext ctx = new JavaSparkContext(conf);

        JavaRDD<String> textLoadRDD = ctx.textFile("C:/spark/README.md");
        System.out.println(textLoadRDD.count());
        System.getProperty("java.io.tmpdir");

    }

}

【Question Discussion】:

  • Welcome to Stack Overflow. Please see here: how to format code
  • Can you provide your code?
  • I have updated my code with the proper formatting. Please check.
  • I just ran into the same problem by exiting spark-shell or by running any of the examples. It is not a permissions issue, because I also tried specifying a different working directory with --conf spark.local.dir. If anyone has a solution, please share.

Tags: java apache-spark bigdata


【Solution 1】:

This is probably because you do not have SPARK_HOME or HADOOP_HOME set, which would let the program find winutils.exe in the bin directory when the SparkContext is instantiated. I found that when I changed from

SparkConf conf = new SparkConf();
JavaSparkContext sc = new JavaSparkContext(conf);

to

JavaSparkContext sc = new JavaSparkContext("local[*]", "programname",
        System.getenv("SPARK_HOME"), System.getenv("JARS"));

the error went away.
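
If you would rather keep building the context from a SparkConf, a commonly mentioned Windows workaround is to point the hadoop.home.dir system property at the directory that contains bin\winutils.exe and to stop the context explicitly before the JVM exits. The sketch below is only an illustration under that assumption; the C:\Hadoop path and the class name are not from the original question:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class WinutilsExample {

    public static void main(String[] args) {
        // Assumption: winutils.exe is installed at C:\Hadoop\bin\winutils.exe.
        // This property must be set before any Spark/Hadoop class is initialized.
        System.setProperty("hadoop.home.dir", "C:\\Hadoop");

        SparkConf conf = new SparkConf().setAppName("Line_Count").setMaster("local[2]");
        JavaSparkContext sc = new JavaSparkContext(conf);

        System.out.println(sc.textFile("C:/spark/README.md").count());

        // Stopping the context explicitly gives the shutdown hook a better chance
        // to release file handles before it tries to delete the spark-* temp directory.
        sc.stop();
    }
}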

【Discussion】:

【Solution 2】:

I believe you are trying to run the program without setting the HADOOP_HOME or SPARK_LOCAL_DIRS user variables. I had the same problem and solved it by creating those variables, e.g. HADOOP_HOME -> C:\Hadoop, SPARK_LOCAL_DIRS -> C:\tmp\spark.
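
One thing worth ruling out is that newly created user variables are only visible to consoles opened after they were set. The small check below is my own sketch (the EnvCheck class name is just for illustration, it is not part of the original answer); run it the same way you run the job to see what the driver JVM actually picks up:

public class EnvCheck {

    public static void main(String[] args) {
        // Print the environment variables this answer relies on, as seen by this JVM.
        System.out.println("HADOOP_HOME      = " + System.getenv("HADOOP_HOME"));
        System.out.println("SPARK_LOCAL_DIRS = " + System.getenv("SPARK_LOCAL_DIRS"));
        // The JVM temp directory Spark falls back to for its spark-* scratch folders.
        System.out.println("java.io.tmpdir   = " + System.getProperty("java.io.tmpdir"));
    }
}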

【Discussion】:

  • I have already set both variables, but I still get the same error.