【Posted】: 2018-08-06 00:42:28
【Problem description】:
I wrote a Spark job whose main purpose is to write to Elasticsearch. The problem is that when I submit it to the Spark cluster, Spark returns:
[ERROR][org.apache.spark.deploy.yarn.ApplicationMaster] User class threw exception: java.lang.AbstractMethodError: org.elasticsearch.spark.sql.DefaultSource.createRelation(Lorg/apache/spark/sql/SQLContext;Lorg/apache/spark/sql/SaveMode;Lscala/collection/immutable/Map;Lorg/apache/spark/sql/Dataset;)Lorg/apache/spark/sql/sources/BaseRelation;
java.lang.AbstractMethodError: org.elasticsearch.spark.sql.DefaultSource.createRelation(Lorg/apache/spark/sql/SQLContext;Lorg/apache/spark/sql/SaveMode;Lscala/collection/immutable/Map;Lorg/apache/spark/sql/Dataset;)Lorg/apache/spark/sql/sources/BaseRelation;
    at org.apache.spark.sql.execution.datasources.DataSource.write(DataSource.scala:472)
    at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:48)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:58)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:56)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:74)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:117)
However, if I submit the job with local[2], it runs fine. This is strange, because the environment and jars are identical in both cases. I am using elasticsearch-spark20_2.11_5.5.0 and Spark 2.2.
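For reference, a minimal sketch of the kind of write such a job performs (the ES host, index name `my-index/doc`, and the sample data are assumptions for illustration, not details from the original job):

```scala
import org.apache.spark.sql.SparkSession

// Minimal sketch of a Spark-to-Elasticsearch write, assuming the
// elasticsearch-spark-20_2.11 connector is on the driver and executor classpath.
object EsWriteSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("es-write-sketch")
      .config("es.nodes", "localhost") // assumed ES host
      .config("es.port", "9200")       // assumed ES port
      .getOrCreate()

    import spark.implicits._
    val df = Seq((1, "a"), (2, "b")).toDF("id", "value") // assumed sample data

    // This save goes through DataSource.write; with a connector jar built
    // against a different Spark version on the cluster classpath, it is the
    // call that surfaces the AbstractMethodError in the trace above.
    df.write
      .format("org.elasticsearch.spark.sql")
      .mode("append")
      .save("my-index/doc") // assumed index/type

    spark.stop()
  }
}
```

Running this requires a Spark installation and a reachable Elasticsearch node, so it is a sketch of the write path rather than a standalone program.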
【Discussion】:
Tags: apache-spark elasticsearch apache-spark-sql apache-spark-2.2