【Question Title】: Maven could not resolve dependencies for Spark
【Posted】: 2014-12-19 08:23:34
【Question Description】:

I am trying to build a simple Java program, JavaWordCount, for spark-1.1.0.

I get this error:

[INFO] Building JavaWordCount 1.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1.279 s
[INFO] Finished at: 2014-10-23T11:28:30-04:00
[INFO] Final Memory: 9M/156M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal on project JavaWordCount: Could not resolve dependencies for project spark.examples:JavaWordCount:jar:1.0-SNAPSHOT: Failure to find org.apache.spark:spark-assembly_2.10:jar:1.1.0 in http://repo.maven.apache.org/maven2 was cached in the local repository, resolution will not be reattempted until the update interval of central has elapsed or updates are forced -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
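Note the second half of that error: the failed lookup "was cached in the local repository, resolution will not be reattempted until the update interval of central has elapsed or updates are forced". So even after the pom is fixed, Maven may keep reporting the cached failure. A sketch of how to force re-resolution (assuming a standard Maven install and the default `~/.m2` local repository path):

```shell
# -U forces Maven to re-check remote repositories, ignoring cached failed lookups
mvn -U clean package

# Alternatively, delete the cached entry for the artifact so it is resolved fresh
rm -rf ~/.m2/repository/org/apache/spark/spark-assembly_2.10
```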

Here are the dependencies in my pom.xml:

<dependencies>
  <dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <version>3.8.1</version>
    <scope>test</scope>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-assembly_2.10</artifactId>
    <version>1.1.0</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-examples_2.10</artifactId>
    <version>1.1.0</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.1.0</version>
  </dependency>
</dependencies>

It includes the Spark assembly.

Any ideas would be much appreciated.

Thanks!

【Comments】:

    Tags: maven hadoop apache-spark


    【Solution 1】:

    The problem is this dependency:

    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-assembly_2.10</artifactId>
      <version>1.1.0</version>
    </dependency>
    

    It is not a jar; it is a pom-only artifact, which means you cannot declare it this way. You can see that in the error message:

    Failure to find org.apache.spark:spark-assembly_2.10:jar:1.1.0
    

    This shows that Maven is trying to download a jar file. Instead, you have to declare it like this:

    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-assembly_2.10</artifactId>
      <version>1.1.0</version>
      <type>pom</type>
    </dependency>
    

    But I'm not sure whether this will solve all your problems. You should dig deeper into the documentation to check whether this is the right path.

    Update: You can also use it as a BOM via:

    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-assembly_2.10</artifactId>
      <version>1.1.0</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
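    One caveat worth adding: `<scope>import</scope>` only takes effect inside a `<dependencyManagement>` section, not in the regular `<dependencies>` list, so a BOM-style import would look roughly like this (a sketch, not tested against this project):

```xml
<dependencyManagement>
  <dependencies>
    <!-- Imports the managed dependency versions from the spark-assembly pom -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-assembly_2.10</artifactId>
      <version>1.1.0</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>
```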
    

    【Discussion】:
