[Posted]: 2016-05-18 21:16:33
[Question]:
I'm trying to build a fat jar with sbt-assembly to send to spark-submit. However, I can't seem to get the build process right.
My current build.sbt is as follows:
name := "MyAppName"

version := "1.0"

scalaVersion := "2.10.6"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.6.0" % "provided",
  "org.apache.spark" %% "spark-mllib" % "1.6.0" % "provided",
  "org.scalanlp" %% "breeze" % "0.12",
  "org.scalanlp" %% "breeze-natives" % "0.12"
)

resolvers ++= Seq(
  "Sonatype Snapshots" at "https://oss.sonatype.org/content/repositories/snapshots/"
)
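In case it's relevant, sbt-assembly itself is enabled via project/plugins.sbt, along these lines (the exact plugin version below is illustrative):

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.1")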
Running sbt assembly produces a jar. However, after submitting the jar to spark-submit with
spark-submit MyAppName-assembly-1.0.jar (a main class is already specified in the manifest, so I assume I can get away with not passing one), the following exception is thrown:
java.lang.NoSuchMethodError: breeze.linalg.DenseVector.noOffsetOrStride()Z
at breeze.linalg.DenseVector$canDotD$.apply(DenseVector.scala:629)
at breeze.linalg.DenseVector$canDotD$.apply(DenseVector.scala:626)
at breeze.linalg.ImmutableNumericOps$class.dot(NumericOps.scala:98)
at breeze.linalg.DenseVector.dot(DenseVector.scala:50)
at RunMe$.cosSimilarity(RunMe.scala:103)
at RunMe$$anonfun$4.apply(RunMe.scala:35)
at RunMe$$anonfun$4.apply(RunMe.scala:33)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
at scala.collection.convert.Wrappers$IteratorWrapper.next(Wrappers.scala:30)
at org.spark-project.guava.collect.Ordering.leastOf(Ordering.java:658)
at org.apache.spark.util.collection.Utils$.takeOrdered(Utils.scala:37)
at org.apache.spark.rdd.RDD$$anonfun$takeOrdered$1$$anonfun$29.apply(RDD.scala:1377)
at org.apache.spark.rdd.RDD$$anonfun$takeOrdered$1$$anonfun$29.apply(RDD.scala:1374)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$20.apply(RDD.scala:710)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$20.apply(RDD.scala:710)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
at org.apache.spark.scheduler.Task.run(Task.scala:89)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
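From what I can tell, a NoSuchMethodError like this points to a binary-incompatible breeze on the runtime classpath: Spark 1.6 itself depends on breeze 0.11.2, while DenseVector.noOffsetOrStride only exists from breeze 0.12 onward, so Spark's provided copy may be shadowing the 0.12 bundled into my fat jar. One workaround I've seen suggested, as a sketch assuming sbt-assembly 0.14+ (which supports shading), is to rename breeze inside the assembly:

// in build.sbt: shade the bundled breeze 0.12 so it cannot clash with
// the breeze 0.11.2 that Spark 1.6 ships on its own classpath
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("breeze.**" -> "shaded.breeze.@1").inAll
)

Alternatively, spark-submit accepts --conf spark.driver.userClassPathFirst=true and --conf spark.executor.userClassPathFirst=true to prefer the application jar's classes over Spark's, though those settings are marked experimental in Spark 1.6.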
I'm relatively new to the world of Scala and sbt, so any help would be much appreciated!
[Discussion]:
-
Possible dupe? stackoverflow.com/questions/28459333/… See if the suggestions there work for you.
-
Not quite the same error. My dependencies weren't being included in my JAR, but the JAR submitted to Spark itself has no issues.
-
When you run assemblyPackageDependency, you should get a jar like MyAppName-assembly-1.0-deps.jar. That will contain your deps.
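For example, a sketch (jar names and paths are illustrative, with the application classes packaged separately via sbt package):
sbt assemblyPackageDependency package
spark-submit --jars target/scala-2.10/MyAppName-assembly-1.0-deps.jar target/scala-2.10/myappname_2.10-1.0.jar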
Tags: scala apache-spark sbt sbt-assembly