【Title】: How to reference jar files after sbt publish-local
【Posted】: 2014-08-17 23:57:12
【Question】:

The spark jars publish successfully to the local repository:

sbt publish-local

Here is an excerpt for spark-core - it looks healthy:

[info]  published spark-core_2.10 to C:\Users\s80035683\.m2\repository\org\apache\spark\spark-core_2.10\1.1.0-SNAPSHOT\spark-core_2.10-1.1.0-SNAPSHOT-javadoc.jar
[info]  published spark-core_2.10 to C:\Users\s80035683\.ivy2\local\org.apache.spark\spark-core_2.10\1.1.0-SNAPSHOT\poms\spark-core_2.10.pom
[info]  published spark-core_2.10 to C:\Users\s80035683\.ivy2\local\org.apache.spark\spark-core_2.10\1.1.0-SNAPSHOT\jars\spark-core_2.10.jar
[info]  published spark-core_2.10 to C:\Users\s80035683\.ivy2\local\org.apache.spark\spark-core_2.10\1.1.0-SNAPSHOT\srcs\spark-core_2.10-sources.jar
[info]  published spark-core_2.10 to C:\Users\s80035683\.ivy2\local\org.apache.spark\spark-core_2.10\1.1.0-SNAPSHOT\docs\spark-core_2.10-javadoc.jar
[info]  published ivy to C:\Users\s80035683\.ivy2\local\org.apache.spark\spark-core_2.10\1.1.0-SNAPSHOT\ivys\ivy.xml

In particular, here are the files in .m2:

C:\Users\s80035683\.m2\repository\org\apache\spark\spark-core_2.10\1.1.0-SNAPSHOT>dir

 Directory of C:\Users\s80035683\.m2\repository\org\apache\spark\spark-core_2.10\1.1.0-SNAPSHOT

06/26/2014  04:25 PM    <DIR>          .
06/26/2014  04:25 PM    <DIR>          ..
06/26/2014  04:25 PM         1,180,476 spark-core_2.10-1.1.0-SNAPSHOT-javadoc.jar
06/26/2014  04:24 PM           808,815 spark-core_2.10-1.1.0-SNAPSHOT-sources.jar
06/26/2014  02:27 PM         5,781,917 spark-core_2.10-1.1.0-SNAPSHOT.jar
06/26/2014  05:03 PM            13,436 spark-core_2.10-1.1.0-SNAPSHOT.pom

The problem arises when trying to use the jars from a client project.

Here is an excerpt from the client's build.sbt:

val sparkVersion = "1.1.0-SNAPSHOT"
..
libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.10" % sparkVersion  % "compile->default"  withSources(),
  "org.apache.spark" % "spark-sql_2.10" % sparkVersion  % "compile->default"  withSources()

..

resolvers  ++= Seq(
  "Apache repo" at "https://repository.apache.org/content/repositories/releases",
  "Local Repo" at Path.userHome.asFile.toURI.toURL + "/.m2/repository",
  Resolver.mavenLocal
)
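
(As an aside: when resolution misbehaves, it helps to see which resolver each artifact is actually served from. A minimal sketch, assuming sbt 0.13.x - this is a stock sbt setting, not something from the build above:)

// build.sbt: log the update task verbosely, so the resolution output
// shows which resolver (Apache repo, Local Repo, mavenLocal) serves each module
logLevel in update := Level.Debug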

So we have:

  • a healthy local repo
  • a build.sbt that references the local repo

But when we run:

sbt package

we get unresolved dependencies on the very spark artifacts we just published:

[info] Loading project definition from C:\apps\hspark\project
[info] Set current project to hspark (in build file:/C:/apps/hspark/)
[info] Updating {file:/C:/apps/hspark/}hspark...
[info] Resolving org.scala-lang#scala-library;2.10.4 ...
  [info] Resolving org.apache.spark#spark-core_2.10;1.1.0-SNAPSHOT ...
  [info] Resolving org.apache.spark#spark-sql_2.10;1.1.0-SNAPSHOT ...
  [info] Resolving org.scala-lang#scala-compiler;2.10.4 ...
  [info] Resolving org.scala-lang#scala-reflect;2.10.4 ...
  [info] Resolving org.scala-lang#jline;2.10.4 ...
  [info] Resolving org.fusesource.jansi#jansi;1.4 ...
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  ::          UNRESOLVED DEPENDENCIES         ::
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  :: org.apache.spark#spark-core_2.10;1.1.0-SNAPSHOT: configuration not found in org.apache.spark#spark-core_2.10;1.1.0-SNAPSHOT: 'default'. It was required from default#hspark_2.10;0.1.0-SNAPSHOT compile
[warn]  :: org.apache.spark#spark-sql_2.10;1.1.0-SNAPSHOT: configuration not found in org.apache.spark#spark-sql_2.10;1.1.0-SNAPSHOT: 'default'. It was required from default#hspark_2.10;0.1.0-SNAPSHOT compile
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.10;1.1.0-SNAPSHOT: configuration not found in org.apache.spark#spark-core_2.10;1.1.0-SNAPSHOT: 'default'. It was required from default#hspark_2.10;0.1.0-SNAPSHOT compile
unresolved dependency: org.apache.spark#spark-sql_2.10;1.1.0-SNAPSHOT: configuration not found in org.apache.spark#spark-sql_2.10;1.1.0-SNAPSHOT: 'default'. It was required from default#hspark_2.10;0.1.0-SNAPSHOT compile
        at sbt.IvyActions$.sbt$IvyActions$$resolve(IvyActions.scala:217)
        at sbt.IvyActions$$anonfun$update$1.apply(IvyActions.scala:126)
..
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:744)
[error] (*:update) sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.10;1.1.0-SNAPSHOT: configuration not found in org.apache.spark#spark-core_2.10;1.1.0-SNAPSHOT: 'default'. It was required from default#hspark_2.10;0.1.0-SNAPSHOT compile
[error] unresolved dependency: org.apache.spark#spark-sql_2.10;1.1.0-SNAPSHOT: configuration not found in org.apache.spark#spark-sql_2.10;1.1.0-SNAPSHOT: 'default'. It was required from default#hspark_2.10;0.1.0-SNAPSHOT compile


UPDATE Per the answer from @lpiepiora, removing compile->default does (surprisingly) make a difference. Here is the evidence so far.

(using the dependency-graph plugin):

Done updating.
[info] default:hspark_2.10:0.1.0-SNAPSHOT [S]
[info] +-org.apache.spark:spark-core_2.10:1.1.0-SNAPSHOT [S]
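
For reproducibility: the tree above came from the sbt-dependency-graph plugin. A minimal setup sketch for its 0.7.x line (the version number here is illustrative):

// project/plugins.sbt
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.7.4")

// build.sbt (the 0.7.x releases require wiring in the plugin's settings)
net.virtualvoid.sbt.graph.Plugin.graphSettings

After a reload, sbt dependencyTree prints the tree shown above.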

【Question comments】:

    Tags: scala sbt apache-spark


    【Solution 1】:

    Try removing the mapping compile->default from your dependencies. As the documentation says, it is redundant anyway:

    A configuration without a mapping (no "->") is mapped to "default" or "compile". The -> is only needed when mapping to a different configuration than those.
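
    To illustrate the rule with made-up coordinates (org.example is hypothetical):

    // No mapping: effectively the same as the explicit "compile" form below,
    // since both end up mapped to the library's "default"/"compile" configuration.
    "org.example" % "lib" % "1.0"
    "org.example" % "lib" % "1.0" % "compile"
    // An explicit "->" is only needed for a non-default target configuration,
    // e.g. depending on the library's test classes from this project's tests:
    "org.example" % "lib" % "1.0" % "test->test"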

    So declare your dependencies as follows:

    libraryDependencies ++= Seq(
      "org.apache.spark" % "spark-core_2.10" % sparkVersion withSources(),
      "org.apache.spark" % "spark-sql_2.10" % sparkVersion  withSources()
    )
    

    They should then resolve.
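
    To double-check, force a fresh resolution; both snapshots should now come from the local repository:

    sbt clean update

    (If Ivy has cached a stale resolution, deleting the cached entries under ~/.ivy2/cache/org.apache.spark can also help force a fresh lookup.)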

    【Comments】:

    • This seems to work. I'm doing some more verification. Expect an accept soon.
    • Actually this does not seem to work. Have you tested it?
    • Yes, I have. I published locally and had another project use it. I'll test it again and put it up somewhere, so maybe you can test it against your configuration.
    • "sparkVersion withSources()" is not working. I had to remove withSources().
    • @javadba It works for me - what error are you getting with withSources()? I'm using sbt 0.13.5.