【Question Title】: Error during Maven build in Scala-Spark
【Posted】: 2015-11-30 03:34:42
【Question Description】:

I am trying to create a simple Spark application using Eclipse and Maven. The Maven build fails with the following error:

Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile (default) on project XXXX: wrap: org.apache.commons.exec.ExecuteException: Process exited with an error: 1 (Exit value: 1) -> [Help 1]

I am using the following pom.xml for the Maven build:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>SparkScalaMM</groupId>
  <artifactId>MMSpark</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <name>${project.artifactId}</name>
  <description>My wonderful scala app</description>
  <inceptionYear>2015</inceptionYear>
      <licenses>
        <license>
          <name>My License</name>
          <url>http://....</url>
          <distribution>repo</distribution>
        </license>
      </licenses>

  <properties>  
     <maven.compiler.source>1.6</maven.compiler.source>
     <maven.compiler.target>1.6</maven.compiler.target>
     <encoding>UTF-8</encoding>
     <scala.version>2.11.5</scala.version>      
    <scala.compat.version>2.11</scala.compat.version>   
  </properties>

 <dependencies>
    <dependency>
       <groupId>org.scala-lang</groupId>
       <artifactId>scala-library</artifactId>
       <version>${scala.version}</version>
    </dependency>
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>4.11</version>
        <scope>test</scope>
   </dependency>
   <dependency>
       <groupId>org.specs2</groupId>
       <artifactId>specs2-core_${scala.compat.version}</artifactId>
       <version>2.4.16</version>
       <scope>test</scope>
   </dependency>
   <dependency>
       <groupId>org.scalatest</groupId>
       <artifactId>scalatest_${scala.compat.version}</artifactId>
       <version>2.2.4</version>
       <scope>test</scope>
   </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.5.2</version>
    </dependency>
  </dependencies>
    <build>
        <sourceDirectory>src/main/scala</sourceDirectory>
        <testSourceDirectory>src/test/scala</testSourceDirectory>
            <plugins>
               <plugin>
                  <groupId>net.alchim31.maven</groupId>
                  <artifactId>scala-maven-plugin</artifactId>
                  <version>3.2.0</version>
                  <executions>
                    <execution>
                      <goals>
                        <goal>compile</goal>
                        <goal>testCompile</goal>
                     </goals>
                  <configuration>
                     <args>
                       <arg>-make:transitive</arg>
                       <arg>-dependencyfile</arg>
                       <arg>${project.build.directory}/.scala_dependencies</arg>
                     </args>
                  </configuration>
                   </execution>
                </executions>
           </plugin>

            <plugin>                                        
               <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-surefire-plugin</artifactId>
               <version>2.18.1</version>
             <configuration>
               <useFile>false</useFile>                                       
               <disableXmlReport>true</disableXmlReport>
               <includes>
                  <include>**/*Test.*</include>
                  <include>**/*Suite.*</include>
               </includes>
            </configuration>
          </plugin>
      </plugins>
   </build>
</project>

【Comments】:

  • I would appreciate any answer to this question.
  • One possible problem I see is that you are using Scala 2.11, but Spark only works with Scala 2.10... Anyway, I don't know whether that is the only problem...
  • Thanks @mark91 for the reply.
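
As the comment above notes, the POM mixes Scala versions: `scala.version` is 2.11.5, but the Spark dependency is `spark-core_2.10`, which is built against Scala 2.10. A minimal sketch of one fix (assuming you stay on Scala 2.11; Spark 1.5.2 does publish `_2.11` artifacts) is to tie the Spark artifact to the same compat version as the other dependencies:

```xml
<!-- Sketch: align the Spark artifact suffix with scala.compat.version (2.11) -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_${scala.compat.version}</artifactId>
    <version>1.5.2</version>
</dependency>
```

Alternatively, downgrade `scala.version`/`scala.compat.version` to 2.10.x and keep `spark-core_2.10`. Note also that the `-make:transitive` compiler option was dropped in Scala 2.11, so with a 2.11 compiler it can itself make scalac exit with an error; removing that `<arg>` is worth trying.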

Tags: eclipse scala maven apache-spark


【Solution 1】:

Please refer to the link.

<execution>
    <id>scala-compile-first</id>
        <phase>process-resources</phase>
            <goals>
                <goal>add-source</goal>
                <goal>compile</goal>
            </goals>
</execution>

<execution>
    <id>scala-test-compile</id>
    <phase>process-test-resources</phase>
    <goals>
        <goal>testCompile</goal>
    </goals>
</execution>

We have to bind the compile goal to the process-resources phase first, and testCompile to process-test-resources; only then does scala-maven-plugin correctly pick up the classpath resources.
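
Combined with the plugin declaration from the question, the two executions above would be wired in like this (a sketch keeping the question's plugin version; the `scala-compile-first` and `scala-test-compile` ids come from the snippet above):

```xml
<plugin>
    <groupId>net.alchim31.maven</groupId>
    <artifactId>scala-maven-plugin</artifactId>
    <version>3.2.0</version>
    <executions>
        <!-- Compile main Scala sources before Java compilation -->
        <execution>
            <id>scala-compile-first</id>
            <phase>process-resources</phase>
            <goals>
                <goal>add-source</goal>
                <goal>compile</goal>
            </goals>
        </execution>
        <!-- Compile Scala test sources -->
        <execution>
            <id>scala-test-compile</id>
            <phase>process-test-resources</phase>
            <goals>
                <goal>testCompile</goal>
            </goals>
        </execution>
    </executions>
</plugin>
```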

【Discussion】:

    【Solution 2】:

    Use what I have provided below, and make sure that the Scala library version on your project build path is 2.11.x.

    <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
        <modelVersion>4.0.0</modelVersion>
        <groupId>com.spark-scala</groupId>
        <artifactId>spark-scala</artifactId>
        <version>0.0.1-SNAPSHOT</version>
        <name>${project.artifactId}</name>
        <description>Spark in Scala</description>
        <inceptionYear>2010</inceptionYear>
    
        <properties>
            <maven.compiler.source>1.8</maven.compiler.source>
            <maven.compiler.target>1.8</maven.compiler.target>
            <encoding>UTF-8</encoding>
            <scala.tools.version>2.10</scala.tools.version>
            <!-- Put the Scala version of the cluster -->
            <scala.version>2.10.4</scala.version>
        </properties>
    
        <!-- repository to add org.apache.spark -->
        <repositories>
            <repository>
                <id>cloudera-repo-releases</id>
                <url>https://repository.cloudera.com/artifactory/repo/</url>
            </repository>
        </repositories>
    
        <build>
            <sourceDirectory>src/main/scala</sourceDirectory>
            <testSourceDirectory>src/test/scala</testSourceDirectory>
            <plugins>
                <plugin>
                    <!-- see http://davidb.github.com/scala-maven-plugin -->
                    <groupId>net.alchim31.maven</groupId>
                    <artifactId>scala-maven-plugin</artifactId>
                    <version>3.2.1</version>
                </plugin>
                <plugin>
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-surefire-plugin</artifactId>
                    <version>2.13</version>
                    <configuration>
                        <useFile>false</useFile>
                        <disableXmlReport>true</disableXmlReport>
                        <includes>
                            <include>**/*Test.*</include>
                            <include>**/*Suite.*</include>
                        </includes>
                    </configuration>
                </plugin>
    
                <!-- "package" command plugin -->
                <plugin>
                    <artifactId>maven-assembly-plugin</artifactId>
                    <version>2.4.1</version>
                    <configuration>
                        <descriptorRefs>
                            <descriptorRef>jar-with-dependencies</descriptorRef>
                        </descriptorRefs>
                    </configuration>
                    <executions>
                        <execution>
                            <id>make-assembly</id>
                            <phase>package</phase>
                            <goals>
                                <goal>single</goal>
                            </goals>
                        </execution>
                    </executions>
                </plugin>
                <plugin>
                    <groupId>org.scala-tools</groupId>
                    <artifactId>maven-scala-plugin</artifactId>
                </plugin>
            </plugins>
        </build>
    
        <dependencies>
            <dependency>
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-core_2.11</artifactId>
                <version>1.2.1</version>
            </dependency>
        </dependencies>
    </project>
    

    【Discussion】:

    • Thanks for your input, but I believe the link I posted solves the problem.
    【Solution 3】:

    http://freecontent.manning.com/wp-content/uploads/how-to-start-developing-spark-applications-in-eclipse.pdf is a great way to start coding with Apache Spark. It specifies a remote archetype that I added to the Maven catalog, and I am now using it to implement Spark applications.

    【Discussion】:

    • I'm glad it helped :)
    【Solution 4】:

    First shut down zinc (the incremental compile server). See the documentation here

    ./build/zinc-<version>/bin/zinc -shutdown
    

    【Discussion】:
