【Question Title】: An error occurs when using Spark to retrieve Elasticsearch data
【Posted】: 2020-02-25 11:29:11
【Question Description】:

I want to use Spark to retrieve some data from an Elasticsearch index. I followed the approach in the official documentation, but it fails with the error below.

Here is my code (Java, JDK 1.8_221):

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.elasticsearch.spark.rdd.api.java.JavaEsSpark;
import scala.Tuple2;
import java.util.Map;

public class Main {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf();
        conf.setMaster("local");
        conf.setAppName("Spark ElasticSearch");

        conf.set("es.index.auto.create", "true");
        conf.set("es.nodes", "10.245.142.213");
        conf.set("es.port", "9200");

        JavaSparkContext sc = new JavaSparkContext(conf);
        sc.setLogLevel("ERROR");

        JavaPairRDD<String, Map<String, Object>> esRDD =
                JavaEsSpark.esRDD(sc, "au_pkt_ams/au_pkt_ams");

        // Use the parameterized type instead of the raw Tuple2 so _1()/_2() are typed
        for (Tuple2<String, Map<String, Object>> tuple : esRDD.collect()) {
            System.out.print(tuple._1() + "-------------");
            System.out.println(tuple._2());
        }
    }
}

Here is the error report (full log):

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/Partition$class
    at org.elasticsearch.spark.rdd.EsPartition.<init>(AbstractEsRDD.scala:84)
    at org.elasticsearch.spark.rdd.AbstractEsRDD$$anonfun$getPartitions$1.apply(AbstractEsRDD.scala:49)
    at org.elasticsearch.spark.rdd.AbstractEsRDD$$anonfun$getPartitions$1.apply(AbstractEsRDD.scala:48)
    at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:237)
    at scala.collection.immutable.List.foreach(List.scala:392)
    at scala.collection.TraversableLike.map(TraversableLike.scala:237)
    at scala.collection.TraversableLike.map$(TraversableLike.scala:230)
    at scala.collection.immutable.List.map(List.scala:298)
    at org.elasticsearch.spark.rdd.AbstractEsRDD.getPartitions(AbstractEsRDD.scala:48)
    at org.apache.spark.rdd.RDD.$anonfun$partitions$2(RDD.scala:253)
    at scala.Option.getOrElse(Option.scala:138)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:251)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:2126)
    at org.apache.spark.rdd.RDD.$anonfun$collect$1(RDD.scala:945)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
    at org.apache.spark.rdd.RDD.collect(RDD.scala:944)
    at org.apache.spark.api.java.JavaRDDLike.collect(JavaRDDLike.scala:361)
    at org.apache.spark.api.java.JavaRDDLike.collect$(JavaRDDLike.scala:360)
    at org.apache.spark.api.java.AbstractJavaRDDLike.collect(JavaRDDLike.scala:45)
    at Main.main(Main.java:24)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.Partition$class
    at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 22 more

Process finished with exit code 1

The log says the failure happens in esRDD.collect(): it cannot find the class 'Partition$class', but the Spark jar containing the Partition class is definitely on the classpath.
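(Editor's note: a `NoClassDefFoundError` for a name ending in `$class` usually points to a Scala binary mismatch rather than a missing jar. Scala 2.11 compiles trait method bodies into a synthetic `TraitName$class` class; Scala 2.12 emits default methods instead, so that class no longer exists. A connector compiled against Scala 2.11 will therefore look up `org.apache.spark.Partition$class`, which a Scala 2.12 Spark build does not contain. A minimal sketch of a probe you could run against your classpath to confirm this; the class names checked are just examples:)

```java
// Probe whether a class is loadable from the current classpath without
// initializing it. Absence of a ...$class name on a Scala 2.12 classpath
// is expected, even though the plain trait class itself is present.
public class ClasspathProbe {
    static boolean isPresent(String className) {
        try {
            Class.forName(className, false, ClasspathProbe.class.getClassLoader());
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Always loadable: part of the JDK
        System.out.println(isPresent("java.lang.String"));
        // Only present when a Scala 2.11 Spark build is on the classpath
        System.out.println(isPresent("org.apache.spark.Partition$class"));
    }
}
```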

【Question discussion】:

    Tags: java apache-spark elasticsearch


    【Solution 1】:

    I ran into the same problem. It turned out to be a Scala version incompatibility between the Spark and Elasticsearch libraries. In my case the Spark libraries on the classpath were built for Scala 2.12, while the elasticsearch-spark connector was only available for Scala 2.11.

    I switched the Spark libraries to their Scala 2.11 builds and replaced elasticsearch-hadoop with elasticsearch-spark-20_2.11. These are the new pom.xml entries that worked for me:

    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.11</artifactId>
      <version>2.4.2</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sql_2.11</artifactId>
      <version>2.4.2</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-hive_2.11</artifactId>
      <version>2.4.2</version>
    </dependency>
    <dependency>
      <groupId>org.elasticsearch</groupId>
      <artifactId>elasticsearch-spark-20_2.11</artifactId>
      <version>7.4.2</version>
    </dependency>
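    One way to keep all the `_2.11`/`_2.12` artifact suffixes from drifting apart again is to factor the Scala binary version into a Maven property. A sketch (the property names here are my own choice, not from the original answer):

    ```xml
    <properties>
      <scala.binary.version>2.11</scala.binary.version>
      <spark.version>2.4.2</spark.version>
    </properties>

    <!-- Every Spark artifact then reuses the same Scala suffix -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_${scala.binary.version}</artifactId>
      <version>${spark.version}</version>
    </dependency>
    <dependency>
      <groupId>org.elasticsearch</groupId>
      <artifactId>elasticsearch-spark-20_${scala.binary.version}</artifactId>
      <version>7.4.2</version>
    </dependency>
    ```

    Changing `scala.binary.version` in one place then updates every artifact suffix at once, which makes this class of mismatch much harder to reintroduce.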
    

    【Discussion】:
