【Question Title】:SBT dependency for sparkSQL
【Posted】:2018-01-16 23:44:12
【Question】:

I am starting to learn Spark SQL and am using the following dependencies in sbt:

name := "sparkLearning"

version := "1.0"

scalaVersion := "2.11.8"

val sparkVersion = "1.6.1"
val sqlVersion = "1.3.1"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" % "spark-sql" % sqlVersion
)

I get the following error:

Error: Error while importing SBT project:
...
[info] Resolving com.thoughtworks.paranamer#paranamer;2.6 ...
[info] Resolving org.scala-sbt#completion;0.13.15 ...
[info] Resolving org.scala-sbt#control;0.13.15 ...
[info] Resolving org.scala-sbt#sbt;0.13.15 ...
[info] Resolving org.scala-sbt#run;0.13.15 ...
[info] Resolving org.scala-sbt.ivy#ivy;2.3.0-sbt-48dd0744422128446aee9ac31aa356ee203cc9f4 ...
[info] Resolving org.scala-sbt#test-interface;1.0 ...
[info] Resolving com.jcraft#jsch;0.1.50 ...
[info] Resolving org.scala-lang#scala-compiler;2.10.6 ...
[info] Resolving jline#jline;2.14.3 ...
[info] Resolving org.scala-sbt#compiler-ivy-integration;0.13.15 ...
[info] Resolving org.scala-sbt#incremental-compiler;0.13.15 ...
[info] Resolving org.scala-sbt#logic;0.13.15 ...
[info] Resolving org.scala-sbt#main-settings;0.13.15 ...
[trace] Stack trace suppressed: run 'last *:update' for the full output.
[trace] Stack trace suppressed: run 'last *:ssExtractDependencies' for the full output.
[error] (*:update) sbt.ResolveException: unresolved dependency: org.apache.spark#spark-sql;1.3.1: not found
[error] (*:ssExtractDependencies) sbt.ResolveException: unresolved dependency: org.apache.spark#spark-sql;1.3.1: not found
[error] Total time: 15 s, completed 27-Jul-2017 15:29:52
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=384M; support was removed in 8.0

Please tell me how to fix this.

【Comments】:

    Tags: scala sbt


    【Solution 1】:

    Your build fails because "org.apache.spark" % "spark-sql" % sqlVersion uses a single %, so sbt looks for an artifact literally named spark-sql, and Spark artifacts are only published with a Scala binary-version suffix (spark-sql_2.10, spark-sql_2.11). The Spark modules should also share one version instead of mixing 1.6.1 and 1.3.1. The correct format of your sbt file is

    name := "sparkLearning"
    
    version := "1.0"
    
    scalaVersion := "2.11.8"
    
    val sparkVersion = "1.6.1"
    
    libraryDependencies ++= Seq(
      "org.apache.spark" % "spark-core_2.10" % sparkVersion,
      "org.apache.spark" % "spark-sql_2.10" % sparkVersion
      )
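
    With scalaVersion := "2.11.8", the explicit artifact suffix has to be _2.11 so that it matches the Scala binary version. The single % with an explicit suffix and the %% operator resolve to the same artifact; as a minimal illustration:

    // With scalaVersion := "2.11.8", these two lines request the same artifact:
    "org.apache.spark" % "spark-sql_2.11" % sparkVersion
    "org.apache.spark" %% "spark-sql" % sparkVersion  // %% appends _2.11 automatically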
    

    I would suggest using the latest Spark version, which is compatible with Scala 2.11.8:

    name := "sparkLearning"
    
    version := "1.0"
    
    scalaVersion := "2.11.8"
    
    val sparkVersion = "2.2.0"
    
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % sparkVersion,
      "org.apache.spark" %% "spark-sql" % sparkVersion
      )
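
    To verify that the dependencies resolve and that Spark SQL actually runs, here is a minimal sketch (the object name SparkSqlCheck and the local[*] master are illustrative, not part of the question):

    import org.apache.spark.sql.SparkSession

    object SparkSqlCheck {
      def main(args: Array[String]): Unit = {
        // Local-mode session for a quick smoke test; on a cluster the
        // master is normally supplied by spark-submit instead.
        val spark = SparkSession.builder()
          .appName("sparkLearning")
          .master("local[*]")
          .getOrCreate()

        import spark.implicits._

        // A tiny DataFrame registered as a view and queried through Spark SQL.
        val df = Seq(("a", 1), ("b", 2)).toDF("key", "value")
        df.createOrReplaceTempView("pairs")
        spark.sql("SELECT key, value FROM pairs WHERE value > 1").show()

        spark.stop()
      }
    }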
    

    【Discussion】:

    • Thanks for your help.
    • Thanks for accepting :) If it helped, you could also upvote.
    • If you plan to run your Spark program with spark-submit, you need to add Provided to the Spark dependencies: val sparkVersion = "2.2.0" ... "org.apache.spark" %% "spark-core" % sparkVersion % Provided, "org.apache.spark" %% "spark-sql" % sparkVersion % Provided, ... (see the sketch after this list).
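
    As a sketch of the build.sbt that the last comment describes (using sbt's built-in Provided configuration), the Provided scope compiles against the Spark jars but leaves them out of the packaged artifact, since spark-submit supplies them at runtime:

    name := "sparkLearning"

    version := "1.0"

    scalaVersion := "2.11.8"

    val sparkVersion = "2.2.0"

    // Provided: on the compile classpath, but excluded from the assembled jar
    // because spark-submit puts these classes on the runtime classpath.
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % sparkVersion % Provided,
      "org.apache.spark" %% "spark-sql" % sparkVersion % Provided
    )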