When a Spark on YARN job is submitted with --deploy-mode cluster, the application starts running on YARN, but the spark-submit process itself stays alive until the application finishes. Only then does the launcher exit. This can be inconvenient, and if you are not careful it can tie up resources on the submitting machine — for example, when submitting a Spark Streaming application that is meant to run indefinitely.

I recently found that Spark has a configuration that changes this behavior; just add one conf when submitting:

--conf spark.yarn.submit.waitAppCompletion=false

The default is defined in org.apache.spark.deploy.yarn.config:

  private[spark] val WAIT_FOR_APP_COMPLETION = ConfigBuilder("spark.yarn.submit.waitAppCompletion")
    .doc("In cluster mode, whether to wait for the application to finish before exiting the " +
      "launcher process.")
    .booleanConf
    .createWithDefault(true)
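With this conf set to false, spark-submit registers the application with YARN and returns right away instead of polling until completion. A sketch of a full submission (the class name, jar path, and resource sizes below are hypothetical placeholders):

```shell
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.yarn.submit.waitAppCompletion=false \
  --class com.example.StreamingJob \
  --num-executors 4 \
  --executor-memory 2g \
  /path/to/streaming-job.jar
```

After the command returns, the application keeps running on YARN; you can check on it with `yarn application -list` and stop it with `yarn application -kill <applicationId>`.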

 
