【Title】: Caused by: java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
【Posted】: 2021-01-18 07:12:37
【Question】:

Hadoop 3.3.0, Flink 1.12, Hive 3.1.2. I want to integrate Hive with Flink. After I configured the sql-client-defaults.yaml file,

catalogs:
   - name: default_catalog
     type: hive
     hive-conf-dir: /cdc/apache-hive-3.1.2-bin/conf

I started the Flink SQL client, but it failed with the following error.

[root@dhf4 bin]# ./sql-client.sh embedded
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/cdc/flink-1.12.0/lib/log4j-slf4j-impl-2.12.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/cdc/hadoop-3.3.0/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
No default environment specified.
Searching for '/cdc/flink-1.12.0/conf/sql-client-defaults.yaml'...found.
Reading default environment from: file:/cdc/flink-1.12.0/conf/sql-client-defaults.yaml
No session environment specified.
2021-01-20 10:12:38,179 INFO  org.apache.hadoop.hive.conf.HiveConf                         [] - Found configuration file file:/cdc/apache-hive-3.1.2-bin/conf/hive-site.xml


Exception in thread "main" org.apache.flink.table.client.SqlClientException: Unexpected exception. This is a bug. Please consider filing an issue.
    at org.apache.flink.table.client.SqlClient.main(SqlClient.java:208)
Caused by: org.apache.flink.table.client.gateway.SqlExecutionException: Could not create execution context.
    at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:878)
    at org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:226)
    at org.apache.flink.table.client.SqlClient.start(SqlClient.java:108)
    at org.apache.flink.table.client.SqlClient.main(SqlClient.java:196)
Caused by: java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
    at org.apache.hadoop.conf.Configuration.set(Configuration.java:1380)
    at org.apache.hadoop.conf.Configuration.set(Configuration.java:1361)
    at org.apache.hadoop.mapred.JobConf.setJar(JobConf.java:536)
    at org.apache.hadoop.mapred.JobConf.setJarByClass(JobConf.java:554)
    at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:448)
    at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:5141)
    at org.apache.hadoop.hive.conf.HiveConf.<init>(HiveConf.java:5109)
    at org.apache.flink.table.catalog.hive.HiveCatalog.createHiveConf(HiveCatalog.java:211)
    at org.apache.flink.table.catalog.hive.HiveCatalog.<init>(HiveCatalog.java:164)
    at org.apache.flink.table.catalog.hive.factories.HiveCatalogFactory.createCatalog(HiveCatalogFactory.java:89)
    at org.apache.flink.table.client.gateway.local.ExecutionContext.createCatalog(ExecutionContext.java:384)
    at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$5(ExecutionContext.java:634)
    at java.util.HashMap.forEach(HashMap.java:1289)
    at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$6(ExecutionContext.java:633)
    at org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:266)
    at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:632)
    at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:529)
    at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:185)
    at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:138)
    at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:867)
    ... 3 more

The log contents are as follows:

[root@dhf4 bin]# cat ../log/flink-root-sql-client-dhf4.log 
2021-01-20 10:12:36,246 INFO  org.apache.flink.configuration.GlobalConfiguration           [] - Loading configuration property: jobmanager.rpc.address, localhost
2021-01-20 10:12:36,252 INFO  org.apache.flink.configuration.GlobalConfiguration           [] - Loading configuration property: jobmanager.rpc.port, 6123
2021-01-20 10:12:36,252 INFO  org.apache.flink.configuration.GlobalConfiguration           [] - Loading configuration property: jobmanager.memory.process.size, 1600m
2021-01-20 10:12:36,252 INFO  org.apache.flink.configuration.GlobalConfiguration           [] - Loading configuration property: taskmanager.memory.process.size, 1728m
2021-01-20 10:12:36,252 INFO  org.apache.flink.configuration.GlobalConfiguration           [] - Loading configuration property: taskmanager.numberOfTaskSlots, 1
2021-01-20 10:12:36,256 INFO  org.apache.flink.configuration.GlobalConfiguration           [] - Loading configuration property: parallelism.default, 1
2021-01-20 10:12:36,256 INFO  org.apache.flink.configuration.GlobalConfiguration           [] - Loading configuration property: jobmanager.execution.failover-strategy, region
2021-01-20 10:12:36,394 INFO  org.apache.flink.table.client.gateway.local.LocalExecutor    [] - Using default environment file: file:/cdc/flink-1.12.0/conf/sql-client-defaults.yaml
2021-01-20 10:12:36,754 INFO  org.apache.flink.table.client.config.entries.ExecutionEntry  [] - Property 'execution.restart-strategy.type' not specified. Using default value: fallback
2021-01-20 10:12:38,179 INFO  org.apache.hadoop.hive.conf.HiveConf                         [] - Found configuration file file:/cdc/apache-hive-3.1.2-bin/conf/hive-site.xml
2021-01-20 10:12:38,404 ERROR org.apache.flink.table.client.SqlClient                      [] - SQL Client must stop. Unexpected exception. This is a bug. Please consider filing an issue.
org.apache.flink.table.client.gateway.SqlExecutionException: Could not create execution context.
    at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:878) ~[flink-sql-client_2.11-1.12.0.jar:1.12.0]
    at org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:226) ~[flink-sql-client_2.11-1.12.0.jar:1.12.0]
    at org.apache.flink.table.client.SqlClient.start(SqlClient.java:108) ~[flink-sql-client_2.11-1.12.0.jar:1.12.0]
    at org.apache.flink.table.client.SqlClient.main(SqlClient.java:196) [flink-sql-client_2.11-1.12.0.jar:1.12.0]
Caused by: java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
    at org.apache.hadoop.conf.Configuration.set(Configuration.java:1380) ~[hadoop-common-3.3.0.jar:?]
    at org.apache.hadoop.conf.Configuration.set(Configuration.java:1361) ~[hadoop-common-3.3.0.jar:?]
    at org.apache.hadoop.mapred.JobConf.setJar(JobConf.java:536) ~[hadoop-mapreduce-client-core-3.3.0.jar:?]
    at org.apache.hadoop.mapred.JobConf.setJarByClass(JobConf.java:554) ~[hadoop-mapreduce-client-core-3.3.0.jar:?]
    at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:448) ~[hadoop-mapreduce-client-core-3.3.0.jar:?]
    at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:5141) ~[flink-sql-connector-hive-3.1.2_2.11-1.12.0.jar:1.12.0]
    at org.apache.hadoop.hive.conf.HiveConf.<init>(HiveConf.java:5109) ~[flink-sql-connector-hive-3.1.2_2.11-1.12.0.jar:1.12.0]
    at org.apache.flink.table.catalog.hive.HiveCatalog.createHiveConf(HiveCatalog.java:211) ~[flink-connector-hive_2.12-1.12.0.jar:1.12.0]
    at org.apache.flink.table.catalog.hive.HiveCatalog.<init>(HiveCatalog.java:164) ~[flink-connector-hive_2.12-1.12.0.jar:1.12.0]
    at org.apache.flink.table.catalog.hive.factories.HiveCatalogFactory.createCatalog(HiveCatalogFactory.java:89) ~[flink-connector-hive_2.12-1.12.0.jar:1.12.0]
    at org.apache.flink.table.client.gateway.local.ExecutionContext.createCatalog(ExecutionContext.java:384) ~[flink-sql-client_2.11-1.12.0.jar:1.12.0]
    at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$5(ExecutionContext.java:634) ~[flink-sql-client_2.11-1.12.0.jar:1.12.0]
    at java.util.HashMap.forEach(HashMap.java:1289) ~[?:1.8.0_272]
    at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$6(ExecutionContext.java:633) ~[flink-sql-client_2.11-1.12.0.jar:1.12.0]
    at org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:266) ~[flink-sql-client_2.11-1.12.0.jar:1.12.0]
    at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:632) ~[flink-sql-client_2.11-1.12.0.jar:1.12.0]
    at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:529) ~[flink-sql-client_2.11-1.12.0.jar:1.12.0]
    at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:185) ~[flink-sql-client_2.11-1.12.0.jar:1.12.0]
    at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:138) ~[flink-sql-client_2.11-1.12.0.jar:1.12.0]
    at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:867) ~[flink-sql-client_2.11-1.12.0.jar:1.12.0]
    ... 3 more

I have tried many solutions, such as making sure the Guava versions all match, but none of them worked. Is there any other solution?
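For checking which Guava versions are actually on the classpath: this `NoSuchMethodError` typically means an older Guava, whose `Preconditions` class lacks the `checkArgument(boolean, String, Object)` overload, is shadowing a newer one. A small helper (a sketch; the commented paths are assumptions about a typical layout) can list every Guava jar under the given install roots so the versions can be compared:

```shell
# List every guava-*.jar under the given directories, sorted, so version
# mismatches between Hadoop, Hive, and Flink become visible at a glance.
list_guava_jars() {
    find "$@" -name 'guava-*.jar' | sort
}

# Example usage (paths assume the usual install layout):
# list_guava_jars "$HADOOP_HOME" "$HIVE_HOME" "$FLINK_HOME"
```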

【Question Comments】:

Tags: hive apache-flink flink-sql


【Solution 1】:

I'd like to share my solution. (My Hadoop version is 3.3.0.)

  • Remove all guava-*.jar from $HADOOP_HOME/share/hadoop/common/lib and $HIVE_HOME/lib
  • Put guava-27.0-jre.jar into $HADOOP_HOME/share/hadoop/common/lib and $HIVE_HOME/lib
  • Copy guava-27.0-jre.jar into $FLINK_HOME/lib and rename it to a_guava-27.0.jre.jar

Please note that renaming the jar in step 3 is very important. And I'm not sure whether step 2 is necessary (maybe you can try skipping it).
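The three steps above can be sketched as a small shell function (an assumption-laden sketch, not the answerer's exact commands: it takes the three install roots as parameters and assumes guava-27.0-jre.jar is present in the current directory, e.g. downloaded from Maven Central):

```shell
# Replace the Guava jars shipped with Hadoop and Hive, and add Guava to Flink's
# lib under a name that sorts first, so it wins on the classpath.
fix_guava_jars() {
    hadoop_home=$1; hive_home=$2; flink_home=$3

    # Step 1: remove every bundled Guava jar from Hadoop and Hive.
    rm -f "$hadoop_home"/share/hadoop/common/lib/guava-*.jar
    rm -f "$hive_home"/lib/guava-*.jar

    # Step 2: install Guava 27.0-jre into both (possibly optional, per the note above).
    cp guava-27.0-jre.jar "$hadoop_home/share/hadoop/common/lib/"
    cp guava-27.0-jre.jar "$hive_home/lib/"

    # Step 3: copy into Flink's lib, renamed with an "a_" prefix.
    cp guava-27.0-jre.jar "$flink_home/lib/a_guava-27.0.jre.jar"
}

# Example usage:
# fix_guava_jars "$HADOOP_HOME" "$HIVE_HOME" "$FLINK_HOME"
```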

【Comments】:
