[Question Title]: SBT gives error when importing Spark's dependencies
[Posted]: 2021-11-12 22:22:07
[Question]:

I am new to Spark and this is my first test project. I followed a tutorial and everything worked, but when I try to reproduce it on my machine it does not. I get an error when building the project. These are the dependencies I am using:

name := "spark"

version := "0.1"

scalaVersion := "2.12.8"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.3.3",
  "org.apache.spark" %% "spark-sql" % "2.3.3"
)

The error I get when importing the dependencies:

[error] sbt.librarymanagement.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.12;2.3.3: not found
[error] unresolved dependency: org.apache.spark#spark-sql_2.12;2.3.3: not found
[error]     at sbt.internal.librarymanagement.IvyActions$.resolveAndRetrieve(IvyActions.scala:332)
[error]     at sbt.internal.librarymanagement.IvyActions$.$anonfun$updateEither$1(IvyActions.scala:208)
[error]     at sbt.internal.librarymanagement.IvySbt$Module.$anonfun$withModule$1(Ivy.scala:239)
[error]     at sbt.internal.librarymanagement.IvySbt.$anonfun$withIvy$1(Ivy.scala:204)
[error]     at sbt.internal.librarymanagement.IvySbt.sbt$internal$librarymanagement$IvySbt$$action$1(Ivy.scala:70)
[error]     at sbt.internal.librarymanagement.IvySbt$$anon$3.call(Ivy.scala:77)
[error]     at xsbt.boot.Locks$GlobalLock.withChannel$1(Locks.scala:113)
[error]     at xsbt.boot.Locks$GlobalLock.withChannelRetries$1(Locks.scala:91)
[error]     at xsbt.boot.Locks$GlobalLock.$anonfun$withFileLock$1(Locks.scala:119)
[error]     at xsbt.boot.Using$.withResource(Using.scala:12)
[error]     at xsbt.boot.Using$.apply(Using.scala:9)
[error]     at xsbt.boot.Locks$GlobalLock.withFileLock(Locks.scala:119)
[error]     at xsbt.boot.Locks$GlobalLock.ignoringDeadlockAvoided(Locks.scala:71)
[error]     at xsbt.boot.Locks$GlobalLock.withLock(Locks.scala:59)
[error]     at xsbt.boot.Locks$.apply0(Locks.scala:47)
[error]     at xsbt.boot.Locks$.apply(Locks.scala:36)
[error]     at sbt.internal.librarymanagement.IvySbt.withDefaultLogger(Ivy.scala:77)
[error]     at sbt.internal.librarymanagement.IvySbt.withIvy(Ivy.scala:199)
[error]     at sbt.internal.librarymanagement.IvySbt.withIvy(Ivy.scala:196)
[error]     at sbt.internal.librarymanagement.IvySbt$Module.withModule(Ivy.scala:238)
[error]     at sbt.internal.librarymanagement.IvyActions$.updateEither(IvyActions.scala:193)
[error]     at 

...
...
...

[error] (update) sbt.librarymanagement.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.12;2.3.3: not found
[error] unresolved dependency: org.apache.spark#spark-sql_2.12;2.3.3: not found
[error] (ssExtractDependencies) sbt.librarymanagement.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.12;2.3.3: not found
[error] unresolved dependency: org.apache.spark#spark-sql_2.12;2.3.3: not found
[error] Total time: 1 s, completed 18-Sep-2021 10:33:42
[info] shutting down server

[Question Discussion]:

    Tags: scala apache-spark intellij-idea apache-spark-sql sbt


    [Solution 1]:

    It looks like Spark 2.3.x requires a compatible Scala version; try 2.11.x as your Scala version. The `%%` operator in sbt appends the Scala binary version to the artifact name, so with `scalaVersion := "2.12.8"` sbt looks for `spark-core_2.12`, which was never published for Spark 2.3.3.

    Source: [sparkDocs](https://spark.apache.org/docs/2.3.0/)

    `Spark runs on Java 8+, Python 2.7+/3.4+ and R 3.1+. For the Scala API, Spark 2.3.0 uses Scala 2.11. You will need to use a compatible Scala version (2.11.x).

    Note that support for Java 7, Python 2.6 and old Hadoop versions before 2.6.5 were removed as of Spark 2.2.0. Support for Scala 2.10 was removed as of 2.3.0.`
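
    A minimal `build.sbt` reflecting this fix might look like the sketch below. The specific patch release 2.11.12 is an assumption (it is the last 2.11.x release); any Scala 2.11.x version should let the artifacts resolve:

    ```scala
    // build.sbt — sketch of the fix: pin Scala to a 2.11.x release so that
    // the %% operator resolves spark-core_2.11 / spark-sql_2.11, the
    // artifacts actually published for Spark 2.3.3.
    name := "spark"

    version := "0.1"

    // Spark 2.3.x is built only against Scala 2.11
    // (2.11.12 is an assumed choice; any 2.11.x works)
    scalaVersion := "2.11.12"

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "2.3.3",
      "org.apache.spark" %% "spark-sql" % "2.3.3"
    )
    ```

    Alternatively, if you want to stay on Scala 2.12, you would need to move to a Spark release that publishes 2.12 artifacts (Spark 2.4.x and later) rather than 2.3.3.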

    [Discussion]:
