[Posted]: 2017-06-22 20:33:11
[Question]:
I get the error message below when I try to run the following spark-submit command:
spark-submit --class "retail.DataValidator" --master local --executor-memory 2g --total-executor-cores 2 sample-spark-180417_2.11-1.0.jar /home/hduser/Downloads/inputfiles/ /home/hduser/output/
Error message:
Exception in thread "main" java.lang.NoClassDefFoundError: com/typesafe/config/ConfigFactory
at retail.DataValidator$.main(DataValidator.scala:12)
at retail.DataValidator.main(DataValidator.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.typesafe.config.ConfigFactory
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 11 more
build.sbt file:
name := "sample-spark-180417"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.0"
libraryDependencies += "com.typesafe" % "config" % "1.3.1"
libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-hive_2.11" % "2.1.0"
libraryDependencies += "mysql" % "mysql-connector-java" % "5.1.42"
libraryDependencies += "org.scala-lang" % "scala-swing" % "2.10+"
I don't have any Maven dependencies or a pom.xml file.
Thanks.
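For context: `sbt package` only bundles the project's own classes, not its library dependencies, which is why the typesafe config classes are missing at runtime. A common fix is to build a fat jar; the sketch below assumes the sbt-assembly plugin (the plugin version shown is an assumption, pick one matching your sbt version):

```scala
// project/plugins.sbt -- add the sbt-assembly plugin (version is an assumption)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")

// build.sbt additions -- mark Spark itself as "provided" so the fat jar
// stays small (the cluster already supplies Spark at runtime), and resolve
// duplicate META-INF entries that otherwise break assembly:
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.0" % "provided"

assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case _                             => MergeStrategy.first
}
```

Then build with `sbt assembly` instead of `sbt package`; the resulting jar under `target/scala-2.11/` will contain `com/typesafe/config/...` and can be passed to spark-submit directly.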
[Comments]:
- According to the sbt file, you have added `libraryDependencies += "com.typesafe" % "config" % "1.3.1"`. That jar supplies the class behind the NoClassDefFoundError. Check whether the jar is being dropped when you build your jar.
- I build the jar from the project path with the `sbt package` command. How do I check whether it was added to the jar? Please advise.
- If it is a fat jar, unpack your jar sample-spark-***.jar and see whether it contains a folder like com/typesafe/config/******.
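The check suggested above can be done without unpacking anything, since a jar is just a zip archive; this is a sketch using the jar name from the question:

```shell
# List the jar's contents and look for the typesafe config classes.
# If this prints nothing, the dependency was not packaged into the jar.
jar tf sample-spark-180417_2.11-1.0.jar | grep com/typesafe/config

# unzip works too, as jars are plain zip files:
unzip -l sample-spark-180417_2.11-1.0.jar | grep com/typesafe/config
```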
- Otherwise, add `--jars /fullpath/first.jar` to the spark-submit command to include the jar for `"com.typesafe" % "config" % "1.3.1"`.
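Applied to the command from the question, the `--jars` suggestion would look like the sketch below. The path to config-1.3.1.jar is an assumption: after an sbt build the jar typically sits in the Ivy cache, but verify the path on your machine first.

```shell
# Sketch: supply the typesafe config jar explicitly at submit time.
# The Ivy cache location below is an assumption -- adjust to your setup.
spark-submit --class "retail.DataValidator" \
  --master local \
  --executor-memory 2g \
  --jars /home/hduser/.ivy2/cache/com.typesafe/config/bundles/config-1.3.1.jar \
  sample-spark-180417_2.11-1.0.jar \
  /home/hduser/Downloads/inputfiles/ /home/hduser/output/
```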
- I checked the contents of the jar file with `jar tf jar-file`, but typesafe is not in it.
Tags: apache-spark