【Question Title】: Flink cluster not starting due to "Could not find or load main class" error
【Posted】: 2020-06-06 13:27:51
【Question Description】:

I am trying to set up Flink and run a cluster. I get the following output, which makes it look like the cluster has started:

$ ./bin/start-cluster.sh
Starting cluster.
Starting standalonesession daemon on host LAPTOP-HRAHBL24.
Starting taskexecutor daemon on host LAPTOP-HRAHBL24.

But when I go to localhost:8081 the connection is refused, so I checked the Flink logs and saw the following error in the taskexecutor log:

Error: Could not find or load main class org.apache.flink.runtime.taskexecutor.TaskManagerRunner

And in the standalonesession log I got this:

Error: Could not find or load main class org.apache.flink.runtime.entrypoint.StandaloneSessionClusterEntrypoint

I have been searching the internet but could not find anything. My Java environment and system variables are correct, since I can see the output of java -version and javac -version. I am using Java 8, specifically jdk1.8.0_251. I tried the above with Flink 1.10.1 and 1.5.0, and both gave me the same error. Any ideas on how to fix this?

【Question Discussion】:

    Tags: java apache-flink


    【Solution 1】:

    I ran into the same problem, but now I can start the cluster and see the localhost:8081 UI.

    Running the cluster on Windows 10 - Apache Flink 1.11.2 for Scala 2.11

    These are the steps I took:

    1. Enable WSL on Windows: https://www.thewindowsclub.com/how-to-run-sh-or-shell-script-file-in-windows-10 (Part 1, "Execute Shell Script file using WSL").
    2. Get Ubuntu for Windows 10 so you can run Linux commands (i.e. run the .sh files). You can do this by going to the Microsoft Store and downloading your preferred Linux distribution (or open PowerShell in Windows and type bash for basic info).
    3. Install OpenJDK on Ubuntu as described here: https://askubuntu.com/questions/746413/trying-to-install-java-8-unable-to-locate-package-openjdk-8-jre (open the Linux shell first, then follow the instructions).

    After these steps you should be able to open a Linux shell in the Apache Flink 1.11.2 folder and run ./bin/start-cluster.sh without any problems.
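    Before starting the cluster, the Linux-side Java install from step 3 can be sanity-checked from the WSL shell. This is a minimal sketch; the package name assumes Ubuntu's OpenJDK 8 package:

```shell
# Check that the WSL/Ubuntu shell sees a Java runtime before running
# ./bin/start-cluster.sh (openjdk-8-jdk is the Ubuntu package name)
if command -v java >/dev/null 2>&1; then
  java -version 2>&1 | head -n 1
else
  echo "java not found: try 'sudo apt-get install openjdk-8-jdk'"
fi
```

    Flink's scripts look up java via JAVA_HOME or the PATH of the shell they run in, so it is the WSL environment, not the Windows one, that matters here.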

    【Discussion】:

      【Solution 2】:

      This may be related to the pom.xml file. Here is an example:

      <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
          xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
          <modelVersion>4.0.0</modelVersion>
      
          <groupId>flink_examples</groupId>
          <artifactId>kafkaExample</artifactId>
          <version>1.0-SNAPSHOT</version>
          <packaging>jar</packaging>
      
          <name>Flink Quickstart Job</name>
          <url>http://www.myorganization.org</url>
      
          <properties>
              <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
              <flink.version>1.9.1</flink.version>
              <java.version>1.8</java.version>
              <scala.binary.version>2.11</scala.binary.version>
              <jackson.version>2.9.0</jackson.version>
              <maven.compiler.source>${java.version}</maven.compiler.source>
              <maven.compiler.target>${java.version}</maven.compiler.target>
          </properties>
      
          <repositories>
              <repository>
                  <id>apache.snapshots</id>
                  <name>Apache Development Snapshot Repository</name>
                  <url>https://repository.apache.org/content/repositories/snapshots/</url>
                  <releases>
                      <enabled>false</enabled>
                  </releases>
                  <snapshots>
                      <enabled>true</enabled>
                  </snapshots>
              </repository>
          </repositories>
      
          <dependencies>
              <!-- Apache Flink dependencies -->
              <!-- These dependencies are provided, because they should not be packaged into the JAR file. -->
              <dependency>
                  <groupId>org.apache.flink</groupId>
                  <artifactId>flink-java</artifactId>
                  <version>${flink.version}</version>
                  <scope>provided</scope>
              </dependency>
              <dependency>
                  <groupId>org.apache.flink</groupId>
                  <artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
                  <version>${flink.version}</version>
                  <scope>provided</scope>
              </dependency>
      
              <!-- Add connector dependencies here. They must be in the default scope (compile). -->
      
              <dependency>
                  <groupId>org.apache.flink</groupId>
                  <artifactId>flink-connector-kafka_${scala.binary.version}</artifactId>
                  <version>${flink.version}</version>
              </dependency>
      
      
              <!-- Add logging framework, to produce console output when running in the IDE. -->
              <!-- These dependencies are excluded from the application JAR by default. -->
              <dependency>
                  <groupId>org.slf4j</groupId>
                  <artifactId>slf4j-log4j12</artifactId>
                  <version>1.7.7</version>
      <!--            <scope>runtime</scope>-->
              </dependency>
              <dependency>
                  <groupId>log4j</groupId>
                  <artifactId>log4j</artifactId>
                  <version>1.2.17</version>
      <!--            <scope>runtime</scope>-->
              </dependency>
              <dependency>
                  <groupId>com.fasterxml.jackson.core</groupId>
                  <artifactId>jackson-databind</artifactId>
                  <version>${jackson.version}</version>
              </dependency>
              <dependency>
                  <groupId>commons-net</groupId>
                  <artifactId>commons-net</artifactId>
                  <version>3.7-SNAPSHOT</version>
              </dependency>
          </dependencies>
      
          <build>
              <plugins>
      
                  <!-- Java Compiler -->
                  <plugin>
                      <groupId>org.apache.maven.plugins</groupId>
                      <artifactId>maven-compiler-plugin</artifactId>
                      <version>3.1</version>
                      <configuration>
                          <source>${java.version}</source>
                          <target>${java.version}</target>
                      </configuration>
                  </plugin>
      
                  <!-- We use the maven-shade plugin to create a fat jar that contains all necessary dependencies. -->
                  <!-- Change the value of <mainClass>...</mainClass> if your program entry point changes. -->
                  <plugin>
                      <groupId>org.apache.maven.plugins</groupId>
                      <artifactId>maven-shade-plugin</artifactId>
                      <version>3.0.0</version>
                      <executions>
                          <!-- Run shade goal on package phase -->
                          <execution>
                              <phase>package</phase>
                              <goals>
                                  <goal>shade</goal>
                              </goals>
                              <configuration>
                                  <artifactSet>
                                      <excludes>
                                          <exclude>org.apache.flink:force-shading</exclude>
                                          <exclude>com.google.code.findbugs:jsr305</exclude>
                                          <exclude>org.slf4j:*</exclude>
                                          <exclude>log4j:*</exclude>
                                      </excludes>
                                  </artifactSet>
                                  <filters>
                                      <filter>
                                          <!-- Do not copy the signatures in the META-INF folder.
                                          Otherwise, this might cause SecurityExceptions when using the JAR. -->
                                          <artifact>*:*</artifact>
                                          <excludes>
                                              <exclude>META-INF/*.SF</exclude>
                                              <exclude>META-INF/*.DSA</exclude>
                                              <exclude>META-INF/*.RSA</exclude>
                                          </excludes>
                                      </filter>
                                  </filters>
                                  <transformers>
                                      <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                                          <mainClass>flink_examples.StreamingJob</mainClass>
                                      </transformer>
                                  </transformers>
                              </configuration>
                          </execution>
                      </executions>
                  </plugin>
              </plugins>
      
              <pluginManagement>
                  <plugins>
      
                      <!-- This improves the out-of-the-box experience in Eclipse by resolving some warnings. -->
                      <plugin>
                          <groupId>org.eclipse.m2e</groupId>
                          <artifactId>lifecycle-mapping</artifactId>
                          <version>1.0.0</version>
                          <configuration>
                              <lifecycleMappingMetadata>
                                  <pluginExecutions>
                                      <pluginExecution>
                                          <pluginExecutionFilter>
                                              <groupId>org.apache.maven.plugins</groupId>
                                              <artifactId>maven-shade-plugin</artifactId>
                                              <versionRange>[3.0.0,)</versionRange>
                                              <goals>
                                                  <goal>shade</goal>
                                              </goals>
                                          </pluginExecutionFilter>
                                          <action>
                                              <ignore/>
                                          </action>
                                      </pluginExecution>
                                      <pluginExecution>
                                          <pluginExecutionFilter>
                                              <groupId>org.apache.maven.plugins</groupId>
                                              <artifactId>maven-compiler-plugin</artifactId>
                                              <versionRange>[3.1,)</versionRange>
                                              <goals>
                                                  <goal>testCompile</goal>
                                                  <goal>compile</goal>
                                              </goals>
                                          </pluginExecutionFilter>
                                          <action>
                                              <ignore/>
                                          </action>
                                      </pluginExecution>
                                  </pluginExecutions>
                              </lifecycleMappingMetadata>
                          </configuration>
                      </plugin>
                  </plugins>
              </pluginManagement>
          </build>
      
          <!-- This profile helps to make things run out of the box in IntelliJ -->
    <!-- It adds Flink's core classes to the runtime class path. -->
          <!-- Otherwise they are missing in IntelliJ, because the dependency is 'provided' -->
          <profiles>
              <profile>
                  <id>add-dependencies-for-IDEA</id>
      
                  <activation>
                      <property>
                          <name>idea.version</name>
                      </property>
                  </activation>
      
                  <dependencies>
                      <dependency>
                          <groupId>org.apache.flink</groupId>
                          <artifactId>flink-java</artifactId>
                          <version>${flink.version}</version>
                          <scope>compile</scope>
                      </dependency>
                      <dependency>
                          <groupId>org.apache.flink</groupId>
                          <artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
                          <version>${flink.version}</version>
                          <scope>compile</scope>
                      </dependency>
                  </dependencies>
              </profile>
          </profiles>
      
      </project>
      

      【Discussion】:

        【Solution 3】:

        I ran into the same problem and have just solved it.

        If you print the java commands that the Flink scripts try to execute, instead of executing them, you get something like java <some-flags> -classpath <all-of-the-jars-in-lib-folder>::: <class-to-execute> <more-flags>.

        Invoking those commands directly in the shell gives the same output, but now you can tweak the command at will to reproduce the bad/desired behavior. There are two problems with the java command.

        The first problem is the ::: at the end of the classpath. Remove it!
        The second problem is the log4j.configurationFile flag, which contains a path that git bash + Windows cannot interpret. I had to replace -Dlog4j.configurationFile=file:/c/Users/eduardo/dev/flink-1.11.2/conf/log4j.properties with -Dlog4j.configurationFile="\\Users\\eduardo\\dev\\flink-1.11.2\\conf\\log4j.properties".

        This worked for me. Give it a try and let me know how it goes!
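        When re-running the captured command by hand, the stray colons can be stripped off the classpath string first. A minimal sketch, with an illustrative (not real) classpath value:

```shell
# Strip the literal trailing ':::' from a captured classpath before reusing
# it in a hand-built java command (POSIX suffix removal, no external tools)
cp="lib/flink-dist.jar:lib/log4j.jar:::"
cp="${cp%:::}"
echo "$cp"
# -> lib/flink-dist.jar:lib/log4j.jar
```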

        【Discussion】:

          【Solution 4】:

          I had the same problem, but for me it crashed right after printing Starting standalonesession daemon on host...

          I think I also managed to work around it, and edu's answer here helped me a lot. I am using this version: Apache Flink 1.11.2 for Scala 2.11, so this workaround may not work for other versions.

          So I logged where the ::: at the end of the classpath comes from, and it turns out it comes from flink-daemon.sh. On line 127 of this file there is the command for the Java VM that runs the code, and it looks like this:

          $JAVA_RUN $JVM_ARGS ${FLINK_ENV_JAVA_OPTS} "${log_setting[@]}" -classpath "`manglePathList "$FLINK_TM_CLASSPATH:$INTERNAL_HADOOP_CLASSPATHS"`" ${CLASS_TO_RUN} "${ARGS[@]}" > "$out"
          

          The important part of this line is the following:

          :$INTERNAL_HADOOP_CLASSPATHS
          

          The 3 colons come from here: the first colon is there explicitly, and the second and third are the value of this variable.
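          The origin of the three colons can be reproduced in isolation. The variable values below are illustrative stand-ins for what the script computes on an install without Hadoop:

```shell
# With no Hadoop on the machine, the Hadoop classpath variable degenerates
# to colon separators only; the explicit ':' in the script adds a third
FLINK_TM_CLASSPATH="lib/flink-dist.jar"
INTERNAL_HADOOP_CLASSPATHS="::"    # hypothetical value when Hadoop is absent
echo "$FLINK_TM_CLASSPATH:$INTERNAL_HADOOP_CLASSPATHS"
# -> lib/flink-dist.jar:::
```

          java treats empty classpath entries as the current directory, which is why the daemon ends up unable to find the Flink entry-point classes when the mangled path confuses the Windows-side java.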

          So basically just delete this part of the line, then save the file and run start-cluster.sh again; this time it should work.

          It may log some Hadoop-related errors, but it does not crash and works fine. For example I saw this:

          Hadoop is not in the classpath/dependencies. The extended set of supported File Systems via Hadoop is not available.
          

          Or this:

          Cannot install HadoopSecurityContext because Hadoop cannot be found in the Classpath.
          

          But it seems to be OK and everything works. By the way, installing Hadoop would probably also solve this, but since it works I am not going to touch it. I can open localhost:8081 with the dashboard running.

          【Discussion】:

            【Solution 5】:

            It worked. I updated the following line in flink-daemon.sh:

              "$JAVA_RUN" $JVM_ARGS ${FLINK_ENV_JAVA_OPTS} "${log_setting[@]}" -classpath "`manglePathList "$FLINK_TM_CLASSPATH"`" ${CLASS_TO_RUN} "${ARGS[@]}" > "$out" 200<&- 2>&1 < /dev/null &
            

            【Discussion】:
