This article analyzes the Spark storage module. It records my notes so far; I will organize them further later.

OK, first let's look at the SparkEnv-related code in SparkContext:

  private[spark] def createSparkEnv(
      conf: SparkConf,
      isLocal: Boolean,
      listenerBus: LiveListenerBus): SparkEnv = {
    SparkEnv.createDriverEnv(conf, isLocal, listenerBus)
  }

  private[spark] def env: SparkEnv = _env


SparkContext calls createDriverEnv on the SparkEnv companion object to create the SparkEnv. Following this entry point, let's see what SparkEnv does:

  /**
   * Create a SparkEnv for the driver.
   */
  private[spark] def createDriverEnv(
      conf: SparkConf,
      isLocal: Boolean,
      listenerBus: LiveListenerBus,
      mockOutputCommitCoordinator: Option[OutputCommitCoordinator] = None): SparkEnv = {
    assert(conf.contains("spark.driver.host"), "spark.driver.host is not set on the driver!")
    assert(conf.contains("spark.driver.port"), "spark.driver.port is not set on the driver!")
    val hostname = conf.get("spark.driver.host")
    val port = conf.get("spark.driver.port").toInt
    create(
      conf,
      SparkContext.DRIVER_IDENTIFIER,
      hostname,
      port,
      isDriver = true,
      isLocal = isLocal,
      listenerBus = listenerBus,
      mockOutputCommitCoordinator = mockOutputCommitCoordinator
    )
  }
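The flow above is a classic factory pattern: validate the driver-side settings in the config, read them out, then delegate to a shared private create() method with the driver identifier. The following is a minimal, Spark-free sketch of that same pattern; SimpleConf, DriverEnv, and DriverEnvFactory are illustrative stand-ins I made up, not Spark's real classes.

```scala
// Stand-in for SparkConf: a string key/value config with contains/get.
case class SimpleConf(settings: Map[String, String]) {
  def contains(key: String): Boolean = settings.contains(key)
  def get(key: String): String = settings(key)
}

// Stand-in for SparkEnv: just records what create() was called with.
case class DriverEnv(executorId: String, hostname: String, port: Int, isDriver: Boolean)

object DriverEnvFactory {
  // Mirrors SparkContext.DRIVER_IDENTIFIER in spirit (value is illustrative).
  val DRIVER_IDENTIFIER = "driver"

  // Mirrors createDriverEnv: assert required settings, parse them, delegate.
  def createDriverEnv(conf: SimpleConf): DriverEnv = {
    assert(conf.contains("spark.driver.host"), "spark.driver.host is not set on the driver!")
    assert(conf.contains("spark.driver.port"), "spark.driver.port is not set on the driver!")
    val hostname = conf.get("spark.driver.host")
    val port = conf.get("spark.driver.port").toInt
    create(conf, DRIVER_IDENTIFIER, hostname, port, isDriver = true)
  }

  // Mirrors the shared create() that both driver and executor paths funnel into.
  private def create(conf: SimpleConf, executorId: String,
                     hostname: String, port: Int, isDriver: Boolean): DriverEnv =
    DriverEnv(executorId, hostname, port, isDriver)
}
```

The point of this shape in Spark is that createDriverEnv and the executor-side entry point both converge on one create() method, which is where the actual environment components get wired up.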
