【Posted】: 2020-12-12 15:32:45
【Question】:
My Spark cluster has 4 healthy workers, and the job runs fine when I use a spark-submit command like this:
spark-submit --class org.apache.spark.examples.SparkPi --master spark://220.149.84.24:7077 --deploy-mode cluster --supervise --executor-memory 2G --total-executor-cores 100 examples/jars/spark-examples_2.11-2.4.5.jar 1000
But if I try to run it from IntelliJ, I get the following error:
20/06/12 15:51:23 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://220.149.84.24:7077...
20/06/12 15:51:23 INFO TransportClientFactory: Successfully created connection to /220.149.84.24:7077 after 23 ms (0 ms spent in bootstraps)
20/06/12 15:51:43 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://220.149.84.24:7077...
20/06/12 15:52:03 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://220.149.84.24:7077...
20/06/12 15:52:23 ERROR StandaloneSchedulerBackend: Application has been killed. Reason: All masters are unresponsive! Giving up.
This is very strange: it is the same Spark master address, "spark://220.149.84.24:7077". Please help me resolve this error.
Here is the SparkContext configuration (I am using Spark 2.4.5):
import org.apache.spark.{SparkConf, SparkContext}

// SparkContext configuration
val conf: SparkConf = new SparkConf()
conf.setMaster("spark://220.149.84.24:7077")      // standalone master URL
conf.setAppName("AirbnbRecommender")              // application name
conf.set("spark.driver.bindAddress", "127.0.0.1") // driver bind address
conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
conf.set("spark.kryoserializer.buffer.max", "128m")
conf.set("spark.eventLog.enabled", "true")
val sc = new SparkContext(conf)
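For comparison, here is a minimal, self-contained sketch of the same standalone-master setup as a runnable application. The `spark.driver.host` property and the address `192.168.0.10` are assumptions for illustration, not values from my setup; the idea is that when the driver runs outside the cluster (e.g. from IntelliJ), the master and executors must be able to connect back to it, so advertising only a loopback address may keep registration from completing:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object AirbnbRecommenderApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setMaster("spark://220.149.84.24:7077") // standalone master URL
      .setAppName("AirbnbRecommender")
      // ASSUMPTION: hypothetical driver address that the cluster can route to;
      // replace with the actual IP of the machine running IntelliJ.
      .set("spark.driver.host", "192.168.0.10")
      .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
    val sc = new SparkContext(conf)
    try {
      // trivial job to confirm executors can register and run tasks
      println(sc.parallelize(1 to 100).sum())
    } finally {
      sc.stop()
    }
  }
}
```

This sketch requires a reachable standalone master and the Spark 2.4.5 libraries on the classpath, so it is a configuration illustration rather than something runnable in isolation.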
【Discussion】:
Tags: scala apache-spark intellij-idea