[Question Title]: Unable to submit jobs to Spark cluster (cluster mode)
[Posted]: 2016-03-08 18:08:45
[Question]:

Spark version 1.3.0

Submitting a job to the Spark cluster in cluster mode fails with an error:

 ./spark-submit --class org.apache.spark.examples.streaming.JavaDirectKafkaWordCount --deploy-mode cluster wordcount-0.1.jar 172.20.5.174:9092,172.20.9.50:9092,172.20.7.135:9092 log

yields:

Spark assembly has been built with Hive, including Datanucleus jars on classpath
Running Spark using the REST application submission protocol.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/04/14 16:41:10 INFO StandaloneRestClient: Submitting a request to launch an application in spark://172.20.9.151:7077.
Warning: Master endpoint spark://172.20.9.151:7077 was not a REST server. Falling back to legacy submission gateway instead.
15/04/14 16:41:11 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Sending launch command to spark://172.20.9.151:7077
Error connecting to master spark://172.20.9.151:7077 (akka.tcp://sparkMaster@172.20.9.151:7077), exiting.

[Question Discussion]:

    Tags: apache-spark spark-streaming


    [Solution 1]:

    By default, the master's Spark REST URL is served on port 6066. You should therefore use that as your master endpoint: spark://172.20.9.151:6066.

    If you go to the Spark web console (http://master:8080), you will see the details of the cluster's various endpoints.
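    For illustration, here is a sketch of the corrected submission command, assuming the same class, jar, and Kafka broker list as in the question; the master IP is taken from the error output, and the key change is passing --master with the REST port 6066 instead of the legacy port 7077:

    ```shell
    # Submit to the standalone master's REST endpoint (port 6066, not 7077).
    # Jar name, class, and broker addresses are copied from the question;
    # adjust them to match your own cluster.
    ./spark-submit \
      --class org.apache.spark.examples.streaming.JavaDirectKafkaWordCount \
      --master spark://172.20.9.151:6066 \
      --deploy-mode cluster \
      wordcount-0.1.jar \
      172.20.5.174:9092,172.20.9.50:9092,172.20.7.135:9092 log
    ```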

    [Discussion]:

    • Why is the REST URL needed? Shouldn't we just provide the Spark master URL, which usually runs on port 7077?