【Title】: sbt unresolved dependency for spark-cassandra-connector 2.0.2
【Posted】: 2017-11-11 16:58:52
【Question】:

build.sbt:

val sparkVersion = "2.1.1";

libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion % "provided";
libraryDependencies += "org.apache.spark" %% "spark-sql" % sparkVersion % "provided";
libraryDependencies += "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided";

libraryDependencies += "com.datastax.spark" % "spark-cassandra-connector" % "2.0.2";
libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % sparkVersion;

Output:

[error] (myproject/*:update) sbt.ResolveException: unresolved dependency: com.datastax.spark#spark-cassandra-connector;2.0.2: not found

Any ideas? I'm new to sbt and Spark. Thanks.

【Question Comments】:

    Tags: scala apache-spark sbt spark-cassandra-connector


    【Solution 1】:

    This is caused by "com.datastax.spark" % "spark-cassandra-connector" % "2.0.2"; missing the Scala version in the artifact id; see the Maven repo:

    http://search.maven.org/#artifactdetails%7Ccom.datastax.spark%7Cspark-cassandra-connector_2.11%7C2.0.2%7Cjar

    There are two solutions:

    1. "com.datastax.spark" % "spark-cassandra-connector_2.11" % "2.0.2" — set the Scala version explicitly in the artifact id of the dependency
    2. "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.2" — use %% with the artifact id, so that sbt automatically appends your project's Scala binary version, expanding to the same thing as solution 1
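    Putting solution 2 into the build.sbt from the question might look like this (a sketch assuming Scala 2.11, which matches the `_2.11` suffix in the Maven artifact above):

    ```scala
    // build.sbt — sketch assuming Scala 2.11 and the versions from the question
    scalaVersion := "2.11.8"

    val sparkVersion = "2.1.1"

    libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion % "provided"
    libraryDependencies += "org.apache.spark" %% "spark-sql" % sparkVersion % "provided"
    libraryDependencies += "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided"

    // %% appends the project's Scala binary version, resolving
    // com.datastax.spark:spark-cassandra-connector_2.11:2.0.2
    libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.2"
    ```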

    【Discussion】:

    • Thanks. I've now changed it to libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.2"; libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % sparkVersion; and the error is now `conflicting cross-version suffixes in: org.apache.spark:spark-tags`. Any ideas?
    • @BAE Have you tried the first approach Chegpohi suggested?
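    The `conflicting cross-version suffixes` error typically arises from mixing a hard-coded `_2.11` artifact id with `%%` dependencies that may resolve to a different Scala suffix. One way to keep the suffixes consistent (a sketch, not confirmed by the thread) is to use `%%` for every Spark artifact:

    ```scala
    // Use %% everywhere so sbt appends the same Scala binary suffix
    // (e.g. _2.11) to every Spark-related artifact id
    libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.2"
    libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka-0-10" % sparkVersion
    ```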